ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Friday 17 April 2026 20:08:45 -0400 (0:00:00.348) 0:00:00.348 **********
ok: [managed-node12]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Friday 17 April 2026 20:08:49 -0400 (0:00:04.314) 0:00:04.662 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Friday 17 April 2026 20:08:50 -0400 (0:00:00.443) 0:00:05.105 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Friday 17 April 2026 20:08:50 -0400 (0:00:00.531) 0:00:05.637 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Friday 17 April 2026 20:08:51 -0400 (0:00:00.379) 0:00:06.017 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Friday 17 April 2026 20:08:51 -0400 (0:00:00.527) 0:00:06.544 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Friday 17 April 2026 20:08:51 -0400 (0:00:00.431) 0:00:06.976 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Friday 17 April 2026 20:08:52 -0400 (0:00:00.470) 0:00:07.446 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Friday 17 April 2026 20:08:52 -0400 (0:00:00.384) 0:00:07.831 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12
META: facts cleared
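Note: every FIPS-mode task above skipped because its conditional evaluated false on this host. A minimal sketch of how such conditional gating typically looks; the variable name fips_enabled and the condition are assumptions, since this log shows only the skip results:

    - name: Ensure dracut-fips
      package:
        name: dracut-fips                      # package name taken from the task title above
        state: present
      when: fips_enabled | d(false) | bool     # assumed gating variable; false here, so the task skips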
TASK [Run the role] ************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Friday 17 April 2026 20:08:53 -0400 (0:00:00.299) 0:00:08.131 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Friday 17 April 2026 20:08:53 -0400 (0:00:00.558) 0:00:08.690 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 April 2026 20:08:54 -0400 (0:00:00.775) 0:00:09.465 **********
ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Friday 17 April 2026 20:08:57 -0400 (0:00:02.794) 0:00:12.260 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 April 2026 20:08:57 -0400 (0:00:00.266) 0:00:12.526 **********
ok: [managed-node12]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 April 2026 20:08:59 -0400 (0:00:01.761) 0:00:14.288 **********
skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 April 2026 20:08:59 -0400 (0:00:00.363) 0:00:14.651 **********
ok: [managed-node12] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 April 2026 20:09:01 -0400 (0:00:01.907) 0:00:16.559 **********
ok: [managed-node12] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 April 2026 20:09:01 -0400 (0:00:00.197) 0:00:16.756 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 April 2026 20:09:01 -0400 (0:00:00.146) 0:00:16.903 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Friday 17 April 2026 20:09:02 -0400 (0:00:00.132) 0:00:17.035 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 April 2026 20:09:02 -0400 (0:00:00.456) 0:00:17.492 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Friday 17 April 2026 20:09:02 -0400 (0:00:00.141) 0:00:17.633 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Friday 17 April 2026 20:09:02 -0400 (0:00:00.137) 0:00:17.771 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Friday 17 April 2026 20:09:07 -0400 (0:00:05.124) 0:00:22.895 **********
ok: [managed-node12] => { "storage_pools | d([])": [] }
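Note: the ostree detection earlier in this run stats a marker path and records the result as a fact. A sketch of that pattern, assuming the conventional /run/ostree-booted marker; the log shows only the stat result, not the path:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted               # assumed marker path; exists=false on this host
      register: __ostree_stat

    - name: Set flag to indicate system is ostree
      set_fact:
        __storage_is_ostree: "{{ __ostree_stat.stat.exists }}"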
"storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:09:08 -0400 (0:00:00.191) 0:00:23.289 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:09:11 -0400 (0:00:02.837) 0:00:26.126 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:09:11 -0400 (0:00:00.333) 0:00:26.459 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:09:11 -0400 (0:00:00.106) 0:00:26.566 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:09:11 -0400 (0:00:00.194) 0:00:26.761 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:09:11 -0400 (0:00:00.183) 0:00:26.944 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:09:16 -0400 (0:00:04.262) 0:00:31.206 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": 
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { 
"name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:09:20 -0400 (0:00:04.111) 0:00:35.318 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
Friday 17 April 2026 20:09:20 -0400 (0:00:00.286) 0:00:35.605 **********
ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103
Friday 17 April 2026 20:09:22 -0400 (0:00:01.735) 0:00:37.340 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110
Friday 17 April 2026 20:09:22 -0400 (0:00:00.246) 0:00:37.587 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776470677.3587909, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776470675.8727849, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776470675.8727849, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Friday 17 April 2026 20:09:24 -0400 (0:00:01.449) 0:00:39.037 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133
Friday 17 April 2026 20:09:24 -0400 (0:00:00.227) 0:00:39.264 **********

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139
Friday 17 April 2026 20:09:24 -0400 (0:00:00.288) 0:00:39.552 **********
ok: [managed-node12] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Friday 17 April 2026 20:09:24 -0400 (0:00:00.251) 0:00:39.804 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
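Note: the "Manage the pools and volumes" task above is the role's bundled blivet module doing the actual work; with empty pool and volume lists it is a no-op, so actions, mounts, and crypts all come back empty. A sketch of roughly how the role invokes it; parameter names are assumptions based on the variables displayed in this log:

    - name: Manage the pools and volumes to match the specified state
      blivet:                                  # custom module shipped inside the storage role
        pools: "{{ _storage_pools_list }}"
        volumes: "{{ _storage_volumes_list }}"
        safe_mode: "{{ storage_safe_mode | d(true) }}"
      register: blivet_output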
TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152
Friday 17 April 2026 20:09:25 -0400 (0:00:00.219) 0:00:40.024 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168
Friday 17 April 2026 20:09:25 -0400 (0:00:00.208) 0:00:40.233 **********

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179
Friday 17 April 2026 20:09:25 -0400 (0:00:00.172) 0:00:40.405 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184
Friday 17 April 2026 20:09:25 -0400 (0:00:00.435) 0:00:40.840 **********

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195
Friday 17 April 2026 20:09:26 -0400 (0:00:00.185) 0:00:41.026 **********

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207
Friday 17 April 2026 20:09:26 -0400 (0:00:00.143) 0:00:41.169 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215
Friday 17 April 2026 20:09:26 -0400 (0:00:00.140) 0:00:41.310 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776469652.8030713, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220
Friday 17 April 2026 20:09:27 -0400 (0:00:01.295) 0:00:42.606 **********

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Friday 17 April 2026 20:09:27 -0400 (0:00:00.102) 0:00:42.708 **********
ok: [managed-node12]
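Note: both "Tell systemd to refresh its view of /etc/fstab" tasks above skipped because no mounts changed. When they do run, a daemon reload is the standard way to make systemd regenerate its fstab-derived mount units; a minimal sketch, with the condition variable being an assumption:

    - name: Tell systemd to refresh its view of /etc/fstab
      systemd:
        daemon_reload: true
      when: blivet_output.mounts | length > 0  # assumed condition; false in this run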
TASK [Get unused disks] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:75
Friday 17 April 2026 20:09:29 -0400 (0:00:01.785) 0:00:44.493 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node12

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Friday 17 April 2026 20:09:29 -0400 (0:00:00.337) 0:00:44.832 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Friday 17 April 2026 20:09:34 -0400 (0:00:04.252) 0:00:49.084 **********
ok: [managed-node12] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] }

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Friday 17 April 2026 20:09:36 -0400 (0:00:02.839) 0:00:51.923 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Friday 17 April 2026 20:09:37 -0400 (0:00:00.356) 0:00:52.280 **********
ok: [managed-node12] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Friday 17 April 2026 20:09:37 -0400 (0:00:00.270) 0:00:52.551 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
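Note: the disk discovery above comes from a test-support module that parses lsblk output and rejects anything with partitions or a filesystem, which is why /dev/xvda is excluded. A sketch of how the surrounding tasks appear to consume it; the module invocation and parameter are assumptions, since only the returned "disks"/"info" keys are visible in this log:

    - name: Find unused disks in the system
      find_unused_disk:                        # hypothetical invocation of the tests' helper module
        min_size: "1g"                         # assumed parameter
      register: unused_disks_return

    - name: Set unused_disks if necessary
      set_fact:
        unused_disks: "{{ unused_disks_return.disks }}"
      when: unused_disks_return.disks | length > 0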
TASK [Print unused disks] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Friday 17 April 2026 20:09:37 -0400 (0:00:00.183) 0:00:52.735 **********
ok: [managed-node12] => { "unused_disks": [ "sda" ] }

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:84
Friday 17 April 2026 20:09:37 -0400 (0:00:00.165) 0:00:52.900 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 17 April 2026 20:09:38 -0400 (0:00:00.239) 0:00:53.140 **********
ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Friday 17 April 2026 20:09:38 -0400 (0:00:00.074) 0:00:53.214 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Friday 17 April 2026 20:09:38 -0400 (0:00:00.222) 0:00:53.436 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Friday 17 April 2026 20:09:38 -0400 (0:00:00.163) 0:00:53.599 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 April 2026 20:09:38 -0400 (0:00:00.261) 0:00:53.861 **********
ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Friday 17 April 2026 20:09:40 -0400 (0:00:01.782) 0:00:55.644 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 April 2026 20:09:40 -0400 (0:00:00.109) 0:00:55.753 **********
ok: [managed-node12]
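Note: verify-role-failed.yml above snapshots the global storage_* variables (including storage_safe_mode_global: true) so they can be restored after the role is deliberately driven to fail. One common shape for such a failure check, sketched purely as an assumption since the tests' actual mechanism is not visible in this excerpt:

    - name: Verify role raises correct error
      block:
        - name: Run the role with a spec that should be rejected
          include_role:
            name: fedora.linux_system_roles.storage
        - name: Unreachable on success
          fail:
            msg: role was expected to fail but did not
      rescue:
        - name: Check returned error
          assert:
            that: ansible_failed_result.msg is search(__storage_failed_regex)  # hypothetical variable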
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 April 2026 20:09:42 -0400 (0:00:01.464) 0:00:57.218 **********
skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 April 2026 20:09:42 -0400 (0:00:00.225) 0:00:57.444 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 April 2026 20:09:42 -0400 (0:00:00.135) 0:00:57.579 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 April 2026 20:09:42 -0400 (0:00:00.083) 0:00:57.663 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 April 2026 20:09:42 -0400 (0:00:00.071) 0:00:57.734 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Friday 17 April 2026 20:09:42 -0400 (0:00:00.096) 0:00:57.830 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12
April 2026 20:09:43 -0400 (0:00:00.376) 0:00:58.207 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:09:43 -0400 (0:00:00.360) 0:00:58.568 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:09:43 -0400 (0:00:00.159) 0:00:58.727 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:09:47 -0400 (0:00:04.148) 0:01:02.875 ********** ok: [managed-node12] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:09:48 -0400 (0:00:00.199) 0:01:03.075 ********** ok: [managed-node12] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:09:48 -0400 (0:00:00.225) 0:01:03.300 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:09:53 -0400 (0:00:05.477) 0:01:08.778 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:09:54 -0400 (0:00:00.355) 0:01:09.134 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:09:54 -0400 (0:00:00.158) 0:01:09.292 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:09:54 -0400 (0:00:00.192) 0:01:09.485 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** 
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:09:54 -0400 (0:00:00.170) 0:01:09.655 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:09:58 -0400 (0:00:03.595) 0:01:13.251 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:10:01 -0400 (0:00:02.905) 0:01:16.157 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:10:01 -0400 (0:00:00.389) 0:01:16.546 ********** fatal: [managed-node12]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:10:06 -0400 (0:00:05.146) 0:01:21.693 ********** fatal: [managed-node12]: FAILED! 
=> { "changed": false }
MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:10:06 -0400 (0:00:00.285) 0:01:21.979 **********
TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:10:07 -0400 (0:00:00.464) 0:01:22.443 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:10:07 -0400 (0:00:00.235) 0:01:22.679 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed
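[Editor's note: reconstructed from the module_args echoed in the failure above, the role invocation this test verifies is roughly the following sketch. Only the role name and the variable values come from the log; include_role and the task name are assumptions, and the real test plumbing lives in verify-role-failed.yml and run_role_with_clear_facts.yml:

    # sketch: values taken from the module_args above; task shape assumed
    - name: Create an encrypted disk volume with no key (expected to fail)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true        # matches 'safe_mode': True above
        storage_volumes:
          - name: foo
            type: disk
            disks: ["sda"]
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            # encryption_password/encryption_key deliberately omitted, so
            # blivet fails with "encrypted volume 'foo' missing key/password"

The next test, "Create an encrypted disk volume w/ default fs", repeats the invocation with encryption_password supplied, as the storage_volumes echo further down shows.]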
TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:10:07 -0400 (0:00:00.326) 0:01:23.005 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Create an encrypted disk volume w/ default fs] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:100 Friday 17 April 2026 20:10:08 -0400 (0:00:00.222) 0:01:23.228 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12
META: facts cleared
TASK [Run the role] ************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:10:08 -0400 (0:00:00.430) 0:01:23.658 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Run the role normally] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:10:08 -0400 (0:00:00.213) 0:01:23.872 **********
TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:10:09 -0400 (0:00:00.198) 0:01:24.070 **********
ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:10:10 -0400 (0:00:01.629) 0:01:25.700 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:10:10 -0400 (0:00:00.150) 0:01:25.850 **********
ok: [managed-node12]
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:10:12 -0400 (0:00:01.723) 0:01:27.574 **********
skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [
"/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:10:12 -0400 (0:00:00.416) 0:01:27.990 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:10:13 -0400 (0:00:00.227) 0:01:28.217 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:10:13 -0400 (0:00:00.241) 0:01:28.459 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:10:13 -0400 (0:00:00.178) 0:01:28.637 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:10:13 -0400 (0:00:00.148) 0:01:28.786 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:10:14 -0400 (0:00:00.431) 0:01:29.217 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:10:14 -0400 (0:00:00.234) 0:01:29.452 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:10:14 -0400 (0:00:00.223) 0:01:29.675 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:10:18 -0400 (0:00:03.740) 0:01:33.415 ********** ok: [managed-node12] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:10:18 -0400 (0:00:00.184) 0:01:33.599 ********** ok: [managed-node12] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:10:18 -0400 (0:00:00.275) 0:01:33.875 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:10:24 -0400 (0:00:05.552) 0:01:39.427 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:10:24 -0400 (0:00:00.401) 0:01:39.829 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:10:24 -0400 (0:00:00.171) 0:01:40.001 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:10:25 -0400 (0:00:00.195) 0:01:40.196 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:10:25 -0400 (0:00:00.134) 0:01:40.331 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:10:29 -0400 (0:00:04.352) 0:01:44.684 ********** ok: [managed-node12] => { 
"ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", 
"status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": 
"rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": 
"systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:10:32 -0400 (0:00:02.710) 0:01:47.394 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:10:32 -0400 (0:00:00.172) 0:01:47.566 ********** changed: [managed-node12] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103
Friday 17 April 2026 20:10:46 -0400 (0:00:13.599) 0:02:01.166 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110
Friday 17 April 2026 20:10:46 -0400 (0:00:00.224) 0:02:01.391 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776470677.3587909, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776470675.8727849, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776470675.8727849, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Friday 17 April 2026 20:10:47 -0400 (0:00:01.178) 0:02:02.570 **********
ok: [managed-node12] => { "backup": "", "changed": false }
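The fingerprint task reported no change, which is consistent with the "# system_role:storage" comment (visible at the top of the fstab dump further down) already being in place from an earlier role run in this play. Conceptually the step amounts to something like the following sketch; the role's actual implementation may differ:

    - name: Add role fingerprint to /etc/fstab (conceptual sketch)
      lineinfile:
        path: /etc/fstab
        line: "# system_role:storage"
        insertbefore: BOF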
[ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:10:50 -0400 (0:00:00.299) 0:02:05.552 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:10:50 -0400 (0:00:00.262) 0:02:05.815 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:10:51 -0400 (0:00:00.257) 0:02:06.072 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:10:51 -0400 (0:00:00.095) 0:02:06.167 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: 
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184
Friday 17 April 2026 20:10:53 -0400 (0:00:02.663) 0:02:08.831 **********
changed: [managed-node12] => (item={'src': '/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac" }
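Each loop item above carries the fields of one fstab entry. Mapped onto the mount module, the single item from this run would look roughly like this sketch (not the role's literal task):

    - name: Mount the new LUKS-backed filesystem (sketch)
      mount:
        src: /dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac
        path: /opt/test1
        fstype: xfs
        opts: defaults
        state: mounted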
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195
Friday 17 April 2026 20:10:56 -0400 (0:00:02.680) 0:02:11.511 **********
skipping: [managed-node12] => (item={'src': '/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "state": "mounted" }, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207
Friday 17 April 2026 20:10:56 -0400 (0:00:00.266) 0:02:11.777 **********
ok: [managed-node12] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215
Friday 17 April 2026 20:10:58 -0400 (0:00:01.596) 0:02:13.373 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776469652.8030713, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220
Friday 17 April 2026 20:10:59 -0400 (0:00:01.281) 0:02:14.655 **********
changed: [managed-node12] => (item={'backing_device': '/dev/sda', 'name': 'luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "password": "-", "state": "present" } }
MSG: line added
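"line added" means a new record now exists in /etc/crypttab, and the item fields map directly onto the standard three-field crypttab format: mapper name, backing device, key file, where "-" means no key file. The output shape is consistent with the core crypttab module; a sketch of an equivalent standalone task under that assumption:

    - name: Record the LUKS mapping in /etc/crypttab (sketch)
      crypttab:
        name: luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac
        backing_device: /dev/sda
        password: '-'
        state: present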
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Friday 17 April 2026 20:11:01 -0400 (0:00:01.738) 0:02:16.393 **********
ok: [managed-node12]

TASK [Verify role results] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:112
Friday 17 April 2026 20:11:02 -0400 (0:00:01.549) 0:02:17.943 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12

TASK [Print out pool information] **********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Friday 17 April 2026 20:11:03 -0400 (0:00:00.380) 0:02:18.324 **********
skipping: [managed-node12] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Friday 17 April 2026 20:11:03 -0400 (0:00:00.148) 0:02:18.473 **********
ok: [managed-node12] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Friday 17 April 2026 20:11:03 -0400 (0:00:00.195) 0:02:18.668 **********
ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "size": "10G", "type": "crypt", "uuid": "3db6923c-601e-47c8-b56f-15417725c3dd" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "5d65d7e3-39d1-4eab-ace2-6c0e6b24faac" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Friday 17 April 2026 20:11:06 -0400 (0:00:02.494) 0:02:21.162 **********
ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002402", "end": "2026-04-17 20:11:08.234879", "rc": 0, "start": "2026-04-17 20:11:08.232477" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Friday 17 April 2026 20:11:08 -0400 (0:00:02.305) 0:02:23.468 **********
ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002768", "end": "2026-04-17 20:11:09.591664", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:11:09.588896" }
STDOUT:
luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac /dev/sda -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Friday 17 April 2026 20:11:09 -0400 (0:00:01.364) 0:02:24.833 **********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Friday 17 April 2026 20:11:09 -0400 (0:00:00.157) 0:02:24.990 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Friday 17 April 2026 20:11:10 -0400 (0:00:00.267) 0:02:25.258 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Friday 17 April 2026 20:11:10 -0400 (0:00:00.165) 0:02:25.424 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml
for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:11:11 -0400 (0:00:01.000) 0:02:26.424 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:11:11 -0400 (0:00:00.206) 0:02:26.631 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:11:11 -0400 (0:00:00.296) 0:02:26.927 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:11:12 -0400 (0:00:00.355) 0:02:27.282 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:11:12 -0400 (0:00:00.665) 0:02:27.948 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:11:13 -0400 (0:00:00.280) 0:02:28.229 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:11:13 -0400 (0:00:00.276) 0:02:28.506 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:11:13 -0400 
(0:00:00.266) 0:02:28.772 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Friday 17 April 2026 20:11:13 -0400 (0:00:00.159) 0:02:28.932 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Friday 17 April 2026 20:11:14 -0400 (0:00:00.246) 0:02:29.179 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Friday 17 April 2026 20:11:14 -0400 (0:00:00.187) 0:02:29.367 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 17 April 2026 20:11:14 -0400 (0:00:00.141) 0:02:29.509 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
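The match lists above drive the assertions that follow: the test counts how many fstab lines carry the volume's mount id, mount point, and options, and compares each count against the expected "1". A conceptual reconstruction of how such a match list can be built, with fstab_lines standing in for whatever registered variable the test actually uses:

    - name: Collect fstab lines referencing the volume (conceptual sketch)
      set_fact:
        storage_test_fstab_id_matches: "{{ fstab_lines | select('search', '/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac ') | list }}"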
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 17 April 2026 20:11:15 -0400 (0:00:00.634) 0:02:30.144 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 17 April 2026 20:11:15 -0400 (0:00:00.318) 0:02:30.462 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 17 April 2026 20:11:15 -0400 (0:00:00.253) 0:02:30.715 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 17 April 2026 20:11:15 -0400 (0:00:00.240) 0:02:30.956 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 17 April 2026 20:11:16 -0400 (0:00:00.260) 0:02:31.217 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 17 April 2026 20:11:16 -0400 (0:00:00.155) 0:02:31.372 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 17 April 2026 20:11:16 -0400 (0:00:00.323) 0:02:31.696 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 April 2026 20:11:16 -0400 (0:00:00.312) 0:02:32.009 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471045.7232807, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471045.7232807, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36443, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471045.7232807, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 April 2026 20:11:18 -0400 (0:00:01.146) 0:02:33.156 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 April 2026 20:11:18 -0400 (0:00:00.248) 0:02:33.404 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 April 2026 20:11:18 -0400 (0:00:00.261) 0:02:33.665 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed
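The stat result above is what the device-node assertions consume: for a present disk volume the node must exist and be a block device. Reduced to a standalone check, with hypothetical variable names:

    - name: Stat the volume's device node (sketch)
      stat:
        path: /dev/sda
      register: dev_node   # hypothetical register name
    - name: Assert the node exists and is a block device
      assert:
        that:
          - dev_node.stat.exists
          - dev_node.stat.isblk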
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 April 2026 20:11:18 -0400 (0:00:00.226) 0:02:33.892 **********
ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 April 2026 20:11:19 -0400 (0:00:00.221) 0:02:34.113 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 April 2026 20:11:19 -0400 (0:00:00.241) 0:02:34.355 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 April 2026 20:11:19 -0400 (0:00:00.212) 0:02:34.567 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471045.8532813, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471045.8532813, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 186469, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471045.8532813, "nlink": 1, "path": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 April 2026 20:11:21 -0400 (0:00:01.650) 0:02:36.217 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 April 2026 20:11:25 -0400 (0:00:04.356) 0:02:40.574 **********
ok: [managed-node12] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011464", "end": "2026-04-17 20:11:26.897687", "rc": 0, "start": "2026-04-17 20:11:26.886223" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           5d65d7e3-39d1-4eab-ace2-6c0e6b24faac
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     926692
        Threads:    2
        Salt:       a9 39 0e bf 93 eb bb e8 c0 a6 40 c4 2d 16 1e 10
                    59 2c fb f1 79 6c 89 40 f5 1b f9 13 af d0 58 16
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       13 1f c2 a7 bb fd ca cc d7 9b 07 d6 2d f6 e2 0b
                    e1 b2 ca ae b6 13 e1 62 25 22 4b 4a f4 c1 1f c1
        Digest:     ab 0e a2 cb 66 2b 39 8b 05 3f 0c 3c d2 ae f5 b4
                    1e de 0c 77 aa c4 fc 71 dd dc 75 6f 30 37 7c 1d
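The header dump confirms the requested settings: "Version: 2" matches encryption_luks_version: luks2, and the aes-xts-plain64 cipher with a 512-bit key appears because no explicit cipher or key size was requested, so cryptsetup defaults apply. A check in the spirit of the "Check LUKS version" assertion below can be sketched like this (register name hypothetical):

    - name: Dump the LUKS header (sketch)
      command: cryptsetup luksDump /dev/sda
      register: luks_dump   # hypothetical register name
      changed_when: false
    - name: Assert the volume is LUKS2
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')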
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 April 2026 20:11:27 -0400 (0:00:01.543) 0:02:42.117 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 April 2026 20:11:27 -0400 (0:00:00.404) 0:02:42.522 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 April 2026 20:11:27 -0400 (0:00:00.484) 0:02:43.006 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 April 2026 20:11:28 -0400 (0:00:00.249) 0:02:43.256 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 April 2026 20:11:28 -0400 (0:00:00.331) 0:02:43.587 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Friday 17 April 2026 20:11:29 -0400 (0:00:00.535) 0:02:44.122 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Friday 17 April 2026 20:11:29 -0400 (0:00:00.248) 0:02:44.371 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Friday 17 April 2026 20:11:29 -0400 (0:00:00.232) 0:02:44.604 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for
/etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:11:29 -0400 (0:00:00.328) 0:02:44.933 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:11:30 -0400 (0:00:00.195) 0:02:45.129 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:11:30 -0400 (0:00:00.268) 0:02:45.398 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:11:30 -0400 (0:00:00.376) 0:02:45.774 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:11:31 -0400 (0:00:00.430) 0:02:46.205 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:11:31 -0400 (0:00:00.172) 0:02:46.377 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:11:31 -0400 (0:00:00.313) 0:02:46.691 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:11:31 -0400 (0:00:00.273) 0:02:46.964 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:11:32 -0400 (0:00:00.205) 0:02:47.169 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:11:32 -0400 (0:00:00.313) 0:02:47.482 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:11:32 -0400 (0:00:00.272) 0:02:47.755 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:11:32 -0400 (0:00:00.227) 0:02:47.983 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:11:33 -0400 (0:00:00.166) 0:02:48.150 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:11:33 -0400 (0:00:00.168) 0:02:48.318 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:11:33 -0400 (0:00:00.248) 0:02:48.567 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:11:33 -0400 (0:00:00.181) 0:02:48.748 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:11:33 -0400 (0:00:00.218) 0:02:48.967 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:11:34 -0400 (0:00:00.283) 0:02:49.250 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:11:34 -0400 (0:00:00.163) 0:02:49.414 ********** ok: 
[managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:11:34 -0400 (0:00:00.242) 0:02:49.656 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:11:34 -0400 (0:00:00.326) 0:02:49.983 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:11:35 -0400 (0:00:00.300) 0:02:50.284 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:11:35 -0400 (0:00:00.196) 0:02:50.480 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:11:35 -0400 (0:00:00.225) 0:02:50.706 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:11:35 -0400 (0:00:00.293) 0:02:51.000 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:11:36 -0400 (0:00:00.263) 0:02:51.263 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:11:36 -0400 (0:00:00.361) 0:02:51.625 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:11:36 -0400 (0:00:00.136) 0:02:51.762 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:11:36 -0400 
(0:00:00.180) 0:02:51.942 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:11:37 -0400 (0:00:00.138) 0:02:52.080 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:11:37 -0400 (0:00:00.224) 0:02:52.305 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:11:37 -0400 (0:00:00.263) 0:02:52.569 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:11:37 -0400 (0:00:00.212) 0:02:52.781 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:11:38 -0400 (0:00:00.260) 0:02:53.041 ********** skipping: [managed-node12] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:11:38 -0400 (0:00:00.250) 0:02:53.292 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:11:38 -0400 (0:00:00.228) 0:02:53.520 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:11:38 -0400 (0:00:00.247) 0:02:53.767 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:11:38 -0400 (0:00:00.215) 0:02:53.983 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 
20:11:39 -0400 (0:00:00.218) 0:02:54.202 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:11:39 -0400 (0:00:00.342) 0:02:54.544 ********** ok: [managed-node12] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:11:39 -0400 (0:00:00.290) 0:02:54.834 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:11:40 -0400 (0:00:00.341) 0:02:55.176 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:11:40 -0400 (0:00:00.325) 0:02:55.501 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:11:40 -0400 (0:00:00.369) 0:02:55.871 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:11:41 -0400 (0:00:00.235) 0:02:56.106 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:11:41 -0400 (0:00:00.395) 0:02:56.502 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:11:41 -0400 (0:00:00.219) 0:02:56.721 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:11:41 -0400 (0:00:00.156) 0:02:56.878 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
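
The "VARIABLE IS NOT DEFINED!" value printed for storage_test_expected_size above is expected output, not an error: the debug module prints that marker for an undefined variable instead of failing the task, which lets the test display optional values unconditionally. A minimal sketch of the pattern (task layout assumed, not copied from the test files):

    - name: Show expected size
      debug:
        var: storage_test_expected_size  # prints "VARIABLE IS NOT DEFINED!" when unset
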
false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:11:42 -0400 (0:00:00.248) 0:02:57.126 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:11:42 -0400 (0:00:00.232) 0:02:57.358 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:11:42 -0400 (0:00:00.147) 0:02:57.506 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:11:42 -0400 (0:00:00.127) 0:02:57.633 ********** changed: [managed-node12] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:118 Friday 17 April 2026 20:11:46 -0400 (0:00:03.420) 0:03:01.054 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:11:46 -0400 (0:00:00.518) 0:03:01.572 ********** ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:11:46 -0400 (0:00:00.202) 0:03:01.775 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:11:47 -0400 (0:00:00.259) 0:03:02.034 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:11:47 -0400 (0:00:00.262) 0:03:02.297 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:11:47 -0400 (0:00:00.291) 0:03:02.588 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:11:49 -0400 (0:00:02.045) 0:03:04.634 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:11:49 -0400 (0:00:00.247) 0:03:04.882 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:11:52 -0400 (0:00:02.208) 0:03:07.091 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:11:52 -0400 (0:00:00.435) 0:03:07.526 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
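
In the "Set platform/version specific variables" loop above, RedHat.yml and CentOS.yml are skipped and vars/CentOS_8.yml is applied twice, because both the distribution pattern and the distribution-plus-major-version pattern resolve to the same file. Judging from the facts it loads, that vars file defines roughly the following (a reconstruction from this output, not the verbatim file):

    # roles/storage/vars/CentOS_8.yml (reconstructed sketch)
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - vdo
      - kmod-kvdo
      - xfsprogs
      - stratisd
      - stratis-cli
      # architecture-dependent entry, evaluated when the list is used
      - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
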
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:11:53 -0400 (0:00:00.625) 0:03:08.151 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:11:53 -0400 (0:00:00.213) 0:03:08.364 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:11:53 -0400 (0:00:00.202) 0:03:08.567 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:11:53 -0400 (0:00:00.241) 0:03:08.809 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:11:54 -0400 (0:00:00.497) 0:03:09.306 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:11:54 -0400 (0:00:00.169) 0:03:09.476 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:11:54 -0400 (0:00:00.272) 0:03:09.749 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:11:59 -0400 (0:00:04.764) 0:03:14.513 ********** ok: [managed-node12] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:11:59 -0400 (0:00:00.290) 0:03:14.804 ********** ok: [managed-node12] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path:
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:12:00 -0400 (0:00:00.278) 0:03:15.082 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:12:06 -0400 (0:00:06.007) 0:03:21.089 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:12:06 -0400 (0:00:00.352) 0:03:21.442 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:12:06 -0400 (0:00:00.222) 0:03:21.665 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:12:06 -0400 (0:00:00.252) 0:03:21.917 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:12:07 -0400 (0:00:00.138) 0:03:22.055 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:12:11 -0400 (0:00:04.728) 0:03:26.783 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": 
"chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:12:14 -0400 (0:00:03.052) 0:03:29.836 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:12:15 -0400 (0:00:00.288) 0:03:30.124 ********** fatal: [managed-node12]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:12:20 -0400 (0:00:05.552) 0:03:35.677 ********** fatal: [managed-node12]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:12:21 -0400 (0:00:00.364) 0:03:36.042 ********** TASK [Check that we 
failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:12:21 -0400 (0:00:00.232) 0:03:36.274 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:12:21 -0400 (0:00:00.222) 0:03:36.497 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:12:21 -0400 (0:00:00.298) 0:03:36.796 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:12:21 -0400 (0:00:00.176) 0:03:36.972 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471105.6825264, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471105.6825264, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471105.6825264, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "957440057", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:12:23 -0400 (0:00:01.239) 0:03:38.212 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:139 Friday 17 April 2026 20:12:23 -0400 (0:00:00.193) 0:03:38.406 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:12:23 -0400 (0:00:00.243) 0:03:38.649 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
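
The fatal result above is the outcome this test expects: with safe mode enabled (safe_mode: True in the module args, matching the storage_safe_mode_global value stored earlier), the blivet provider refuses to strip the LUKS formatting from luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac because doing so destroys existing data, and the assertions then confirm both the failure and that /opt/test1/quux survived untouched. A minimal sketch of the request that triggers the error, assembled from the storage_volumes shown earlier in this run (invocation details simplified; not the verbatim test task):

    - name: Try to disable encryption while safe mode is on (expected to fail)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true            # blocks destructive reformatting
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: false              # turning LUKS off removes existing formatting
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo

The "Remove the encryption layer" rerun that starts below presumably repeats the same volume spec with storage_safe_mode set to false, which is what lets the removal proceed.
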
TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:12:23 -0400 (0:00:00.144) 0:03:38.793 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:12:23 -0400 (0:00:00.100) 0:03:38.894 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:12:25 -0400 (0:00:01.368) 0:03:40.262 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:12:25 -0400 (0:00:00.139) 0:03:40.402 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:12:26 -0400 (0:00:01.556) 0:03:41.958 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:12:27 -0400 (0:00:00.419) 0:03:42.378 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path:
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:12:27 -0400 (0:00:00.236) 0:03:42.614 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:12:27 -0400 (0:00:00.236) 0:03:42.851 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:12:28 -0400 (0:00:00.228) 0:03:43.080 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:12:28 -0400 (0:00:00.182) 0:03:43.263 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:12:28 -0400 (0:00:00.306) 0:03:43.569 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:12:28 -0400 (0:00:00.275) 0:03:43.845 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:12:29 -0400 (0:00:00.218) 0:03:44.063 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:12:32 -0400 (0:00:03.812) 0:03:47.876 ********** ok: [managed-node12] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:12:33 -0400 (0:00:00.231) 0:03:48.108 ********** ok: [managed-node12] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
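Before it changes anything, the role queries its blivet module for the packages needed to realize the requested state (the "Get required packages" task here), installs them, and only then re-invokes the module to apply the changes. A sketch of what that probe step might look like, with the module name and the packages_only flag assumed from the task flow in this log rather than verified against the role source:

  - name: Get required packages
    fedora.linux_system_roles.blivet:
      pools: "{{ storage_pools | d([]) }}"
      volumes: "{{ storage_volumes | d([]) }}"
      packages_only: true   # assumed: report needed packages without modifying storage
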
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:12:33 -0400 (0:00:00.190) 0:03:48.299 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:12:38 -0400 (0:00:05.431) 0:03:53.730 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:12:39 -0400 (0:00:00.360) 0:03:54.091 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:12:39 -0400 (0:00:00.134) 0:03:54.226 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:12:39 -0400 (0:00:00.214) 0:03:54.440 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:12:39 -0400 (0:00:00.195) 0:03:54.636 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:12:43 -0400 (0:00:03.819) 0:03:58.455 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": 
"chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:12:46 -0400 (0:00:02.845) 0:04:01.300 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:12:46 -0400 (0:00:00.190) 0:04:01.491 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy 
format", "device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:12:52 -0400 (0:00:05.712) 0:04:07.204 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:12:52 -0400 (0:00:00.194) 0:04:07.398 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471056.1603234, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9a1bcdea3a9cfc9f5af2963b6383880a10bb0f31", "ctime": 1776471056.1573234, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471056.1573234, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": 
"2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:12:53 -0400 (0:00:01.459) 0:04:08.857 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:12:55 -0400 (0:00:01.409) 0:04:10.267 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:12:55 -0400 (0:00:00.368) 0:04:10.636 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:12:55 -0400 (0:00:00.245) 0:04:10.881 ********** ok: 
[managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:12:56 -0400 (0:00:00.228) 0:04:11.110 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:12:56 -0400 (0:00:00.221) 0:04:11.331 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:12:57 -0400 (0:00:01.414) 0:04:12.745 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:12:59 -0400 (0:00:01.676) 0:04:14.422 ********** changed: [managed-node12] => (item={'src': 'UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "state": 
"mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:13:00 -0400 (0:00:01.518) 0:04:15.940 ********** skipping: [managed-node12] => (item={'src': 'UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:13:01 -0400 (0:00:00.203) 0:04:16.144 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:13:02 -0400 (0:00:01.283) 0:04:17.428 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471069.5903785, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5fec616fc59c5c575648c46e8287b76966e094df", "ctime": 1776471061.114344, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 490733706, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471061.1133437, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "3998251760", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:13:03 -0400 (0:00:01.259) 0:04:18.687 ********** changed: [managed-node12] => (item={'backing_device': '/dev/sda', 'name': 'luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:13:05 -0400 (0:00:01.389) 0:04:20.076 ********** ok: [managed-node12] TASK [Verify role results - 2] ************************************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:152 Friday 17 April 2026 20:13:06 -0400 (0:00:01.323) 0:04:21.400 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:13:06 -0400 (0:00:00.218) 0:04:21.618 ********** skipping: [managed-node12] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:13:06 -0400 (0:00:00.116) 0:04:21.735 ********** ok: [managed-node12] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:13:06 -0400 (0:00:00.224) 0:04:21.959 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:13:08 -0400 (0:00:01.131) 0:04:23.091 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002831", "end": "2026-04-17 20:13:08.895694", "rc": 0, "start": "2026-04-17 20:13:08.892863" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:13:09 -0400 (0:00:01.035) 0:04:24.126 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.003683", "end": "2026-04-17 20:13:11.068853", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:13:10.065170" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:13:11 -0400 (0:00:02.154) 0:04:26.281 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:13:11 -0400 (0:00:00.084) 0:04:26.365 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:13:11 -0400 (0:00:00.332) 0:04:26.698 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:13:11 -0400 (0:00:00.250) 0:04:26.948 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:13:12 -0400 (0:00:00.848) 0:04:27.797 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:13:12 -0400 (0:00:00.197) 0:04:27.995 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:13:13 -0400 (0:00:00.178) 0:04:28.173 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:13:13 -0400 (0:00:00.271) 0:04:28.445 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:13:13 -0400 (0:00:00.280) 0:04:28.725 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:13:14 -0400 (0:00:00.293) 0:04:29.018 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:13:14 -0400 (0:00:00.273) 0:04:29.292 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:13:14 -0400 (0:00:00.266) 0:04:29.559 ********** skipping: [managed-node12] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:13:14 -0400 (0:00:00.167) 0:04:29.726 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:13:14 -0400 (0:00:00.223) 0:04:29.950 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:13:15 -0400 (0:00:00.184) 0:04:30.134 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:13:15 -0400 (0:00:00.232) 0:04:30.367 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:13:16 -0400 (0:00:00.689) 0:04:31.057 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:13:16 -0400 (0:00:00.246) 0:04:31.303 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:13:16 -0400 (0:00:00.320) 0:04:31.623 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:13:16 -0400 (0:00:00.304) 0:04:31.928 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:13:17 -0400 (0:00:00.270) 0:04:32.199 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:13:17 -0400 (0:00:00.261) 0:04:32.460 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:13:17 -0400 (0:00:00.344) 0:04:32.805 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:13:18 -0400 (0:00:00.308) 0:04:33.113 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471171.9067974, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471171.9067974, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36443, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471171.9067974, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:13:19 -0400 (0:00:01.438) 0:04:34.552 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:13:19 -0400 (0:00:00.361) 0:04:34.913 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:13:20 -0400 (0:00:00.284) 0:04:35.197 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:13:20 -0400 (0:00:00.277) 0:04:35.475 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:13:20 -0400 (0:00:00.290) 0:04:35.766 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:13:21 -0400 (0:00:00.269) 0:04:36.035 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:13:21 -0400 (0:00:00.218) 0:04:36.253 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:13:21 -0400 (0:00:00.294) 0:04:36.548 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:13:25 -0400 (0:00:04.451) 0:04:41.000 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:13:26 -0400 (0:00:00.234) 0:04:41.235 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:13:26 -0400 (0:00:00.189) 0:04:41.424 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:13:26 -0400 (0:00:00.365) 0:04:41.790 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:13:27 -0400 (0:00:00.259) 0:04:42.049 ********** skipping: 
[managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:13:27 -0400 (0:00:00.303) 0:04:42.352 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:13:27 -0400 (0:00:00.272) 0:04:42.625 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:13:27 -0400 (0:00:00.257) 0:04:42.883 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:13:28 -0400 (0:00:00.309) 0:04:43.193 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:13:28 -0400 (0:00:00.372) 0:04:43.565 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:13:28 -0400 (0:00:00.348) 0:04:43.914 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:13:29 -0400 (0:00:00.281) 0:04:44.195 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:13:29 -0400 (0:00:00.223) 0:04:44.419 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:13:29 -0400 (0:00:00.259) 0:04:44.678 ********** ok: [managed-node12] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:13:29 -0400 (0:00:00.237) 0:04:44.916 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:13:30 -0400 (0:00:00.249) 0:04:45.166 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:13:30 -0400 (0:00:00.272) 0:04:45.439 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:13:30 -0400 (0:00:00.284) 0:04:45.723 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:13:30 -0400 (0:00:00.200) 0:04:45.924 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:13:31 -0400 (0:00:00.199) 0:04:46.123 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:13:31 -0400 (0:00:00.224) 0:04:46.348 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:13:31 -0400 (0:00:00.307) 0:04:46.655 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:13:31 -0400 (0:00:00.176) 0:04:46.832 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:13:32 -0400 (0:00:00.207) 0:04:47.039 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:13:32 -0400 (0:00:00.229) 0:04:47.268 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:13:32 -0400 (0:00:00.309) 0:04:47.578 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:13:32 -0400 (0:00:00.274) 0:04:47.852 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:13:33 -0400 (0:00:00.286) 0:04:48.138 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:13:33 -0400 (0:00:00.287) 0:04:48.426 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:13:33 -0400 (0:00:00.296) 0:04:48.722 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:13:33 -0400 (0:00:00.281) 0:04:49.004 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:13:34 -0400 (0:00:00.220) 0:04:49.225 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:13:34 -0400 (0:00:00.279) 0:04:49.505 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } 
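For orientation amid this run-on verify output: every task in this stretch checks the single unpooled test volume the play created, named "foo", a whole-disk volume on "sda", mounted at /opt/test1 with a LUKS2 layer. A minimal sketch of a playbook that would drive the role with an equivalently-shaped volume follows; the storage_volumes values mirror what this run echoes under "Show storage_volumes" below, while storage_safe_mode is added here purely for illustration (the test suite's actual invocation lives in tests_luks2.yml and is not reproduced in this log):

    - hosts: managed-node12
      roles:
        - fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false              # illustrative; this run's global value is true (see below)
        storage_volumes:
          - name: foo                         # volume name from this run
            type: disk                        # unpooled, whole-disk volume
            disks:
              - sda                           # the test disk exercised throughout this log
            mount_point: /opt/test1           # matches the fstab entry verified above
            encryption: true                  # put a LUKS layer on the disk
            encryption_luks_version: luks2    # the LUKS2 format under test
            encryption_password: yabbadabbadoo  # throwaway passphrase used by this test

With storage_safe_mode left at its default of true, re-running the role against a device that already carries a filesystem fails rather than reformatting it, which is what the "Test for correct handling of safe_mode" tasks later in this log assert.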
TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:13:34 -0400 (0:00:00.184) 0:04:49.690 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:13:34 -0400 (0:00:00.209) 0:04:49.899 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:13:35 -0400 (0:00:00.238) 0:04:50.137 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:13:35 -0400 (0:00:00.221) 0:04:50.359 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:13:35 -0400 (0:00:00.240) 0:04:50.599 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:13:35 -0400 (0:00:00.226) 0:04:50.825 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:13:36 -0400 (0:00:00.238) 0:04:51.064 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:13:36 -0400 (0:00:00.271) 0:04:51.335 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:13:36 -0400 (0:00:00.300) 0:04:51.635 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:13:37 -0400 (0:00:00.428) 0:04:52.063 ********** skipping: [managed-node12] => 
{} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:13:37 -0400 (0:00:00.303) 0:04:52.366 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:13:37 -0400 (0:00:00.309) 0:04:52.676 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:13:37 -0400 (0:00:00.232) 0:04:52.908 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:13:38 -0400 (0:00:00.267) 0:04:53.176 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:13:38 -0400 (0:00:00.196) 0:04:53.372 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:13:38 -0400 (0:00:00.351) 0:04:53.724 ********** ok: [managed-node12] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:13:38 -0400 (0:00:00.258) 0:04:53.982 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:13:39 -0400 (0:00:00.318) 0:04:54.301 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:13:39 -0400 (0:00:00.278) 0:04:54.579 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:13:39 -0400 (0:00:00.345) 0:04:54.925 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:13:40 -0400 (0:00:00.239) 0:04:55.164 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:13:40 -0400 (0:00:00.275) 0:04:55.440 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:13:40 -0400 (0:00:00.205) 0:04:55.645 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:13:40 -0400 (0:00:00.203) 0:04:55.849 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:13:41 -0400 (0:00:00.224) 0:04:56.074 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:13:41 -0400 (0:00:00.144) 0:04:56.219 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:13:41 -0400 (0:00:00.186) 0:04:56.405 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:13:41 -0400 (0:00:00.138) 0:04:56.544 ********** changed: [managed-node12] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:158 Friday 17 April 2026 20:13:42 -0400 (0:00:01.441) 0:04:57.986 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:13:43 -0400 (0:00:00.502) 0:04:58.488 ********** ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:13:43 -0400 (0:00:00.217) 0:04:58.705 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:13:43 -0400 (0:00:00.205) 0:04:58.911 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:13:44 -0400 (0:00:00.154) 0:04:59.066 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:13:44 -0400 (0:00:00.181) 0:04:59.247 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:13:45 -0400 (0:00:01.759) 0:05:01.007 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:13:46 -0400 (0:00:00.197) 0:05:01.204 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:13:47 -0400 (0:00:01.567) 0:05:02.772 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: 
[managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:13:48 -0400 (0:00:00.289) 0:05:03.061 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:13:48 -0400 (0:00:00.223) 0:05:03.284 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:13:48 -0400 (0:00:00.283) 0:05:03.568 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:13:48 -0400 (0:00:00.130) 0:05:03.699 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:13:48 -0400 (0:00:00.155) 0:05:03.855 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:13:49 -0400 (0:00:00.620) 0:05:04.475 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:13:49 -0400 (0:00:00.251) 0:05:04.727 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:13:49 -0400 (0:00:00.204) 0:05:04.931 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:13:54 -0400 (0:00:04.675) 0:05:09.606 ********** ok: [managed-node12] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:13:54 -0400 (0:00:00.299) 0:05:09.906 ********** ok: [managed-node12] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:13:55 -0400 (0:00:00.304) 0:05:10.210 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:14:00 -0400 (0:00:05.296) 0:05:15.507 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:14:00 -0400 (0:00:00.406) 0:05:15.914 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:14:01 -0400 (0:00:00.184) 0:05:16.098 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:14:01 -0400 (0:00:00.273) 0:05:16.372 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:14:01 -0400 (0:00:00.180) 0:05:16.553 ********** ok: [managed-node12] => 
{ "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:14:05 -0400 (0:00:04.349) 0:05:20.902 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { 
"name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service": { "name": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service": { "name": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": 
"systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:14:09 -0400 (0:00:03.274) 0:05:24.176 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d5d65d7e3\x2d39d1\x2d4eab\x2dace2\x2d6c0e6b24faac.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "name": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override 
cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", 
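
The \x2d runs in these unit names are systemd's escape sequence for a literal dash: in a template instance such as systemd-cryptsetup@<instance>.service, "-" is reserved to encode "/", so the generator escapes the dashes in "luks-5d65d7e3-...". A sketch of reproducing that escaped name with systemd-escape from a task (the registered variable is illustrative):

    - name: Escape the LUKS mapper name the way the cryptsetup generator does
      ansible.builtin.command:
        argv:
          - systemd-escape
          - luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac
      register: escaped_unit
      changed_when: false
      # escaped_unit.stdout should read:
      # luks\x2d5d65d7e3\x2d39d1\x2d4eab\x2dace2\x2d6c0e6b24faac
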
"NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:13:02 EDT", "StateChangeTimestampMonotonic": "1817209226", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...d39d1\x2d4eab\x2dace2\x2d6c0e6b24faac.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "name": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner 
cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": 
"no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:14:13 -0400 (0:00:03.926) 0:05:28.103 ********** fatal: [managed-node12]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:14:18 -0400 (0:00:05.199) 0:05:33.302 ********** fatal: [managed-node12]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:14:18 -0400 (0:00:00.227) 0:05:33.529 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d5d65d7e3\x2d39d1\x2d4eab\x2dace2\x2d6c0e6b24faac.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "name": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service is 
masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d5d65d7e3\\x2d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...d39d1\x2d4eab\x2dace2\x2d6c0e6b24faac.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "name": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", 
"CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", 
"NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d39d1\\x2d4eab\\x2dace2\\x2d6c0e6b24faac.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:14:21 -0400 (0:00:02.989) 0:05:36.524 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:14:21 -0400 (0:00:00.095) 0:05:36.620 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:14:21 -0400 (0:00:00.174) 0:05:36.795 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:14:21 -0400 (0:00:00.080) 0:05:36.875 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471222.7300055, "attr_flags": "", "attributes": [], 
"block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471222.7300055, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471222.7300055, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2732330043", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:14:22 -0400 (0:00:01.130) 0:05:38.005 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:179 Friday 17 April 2026 20:14:23 -0400 (0:00:00.167) 0:05:38.173 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:14:23 -0400 (0:00:00.309) 0:05:38.482 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:14:23 -0400 (0:00:00.296) 0:05:38.779 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:14:23 -0400 (0:00:00.188) 0:05:38.968 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:14:25 -0400 (0:00:01.579) 0:05:40.547 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:14:25 -0400 (0:00:00.183) 0:05:40.730 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:14:27 -0400 (0:00:01.773) 0:05:42.504 ********** 
skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:14:27 -0400 (0:00:00.370) 0:05:42.874 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:14:28 -0400 (0:00:00.225) 0:05:43.100 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:14:28 -0400 (0:00:00.166) 0:05:43.267 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:14:28 -0400 (0:00:00.302) 0:05:43.569 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:14:28 -0400 (0:00:00.101) 0:05:43.670 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:14:29 -0400 (0:00:00.530) 0:05:44.201 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:14:29 -0400 (0:00:00.273) 0:05:44.474 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:14:29 -0400 (0:00:00.224) 0:05:44.699 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:14:33 -0400 (0:00:04.281) 0:05:48.980 ********** ok: [managed-node12] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:14:34 -0400 (0:00:00.183) 0:05:49.163 ********** ok: [managed-node12] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:14:34 -0400 (0:00:00.103) 0:05:49.267 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:14:38 -0400 (0:00:04.578) 0:05:53.846 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:14:39 -0400 (0:00:00.173) 0:05:54.019 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:14:39 -0400 (0:00:00.097) 0:05:54.117 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 
17 April 2026 20:14:39 -0400 (0:00:00.174) 0:05:54.292 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:14:39 -0400 (0:00:00.083) 0:05:54.375 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:14:42 -0400 (0:00:03.175) 0:05:57.551 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": 
"halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", 
"status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:14:45 -0400 (0:00:03.007) 0:06:00.558 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:14:45 -0400 (0:00:00.318) 0:06:00.877 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4b7196b5-6e15-4946-9038-d82a70956529", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:14:59 -0400 (0:00:13.465) 0:06:14.343 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:14:59 -0400 (0:00:00.147) 0:06:14.490 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471180.6678333, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "65e47a00de207ac8bbc438ac7726b4ee18b2cf15", "ctime": 1776471180.6648333, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471180.6648333, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:15:00 -0400 (0:00:01.459) 0:06:15.950 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:15:02 -0400 (0:00:01.463) 0:06:17.414 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:15:02 -0400 (0:00:00.096) 0:06:17.511 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": "xfs" } ], "changed": true, 
"crypts": [ { "backing_device": "/dev/sda", "name": "luks-4b7196b5-6e15-4946-9038-d82a70956529", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:15:02 -0400 (0:00:00.167) 0:06:17.678 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:15:02 -0400 (0:00:00.264) 0:06:17.943 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:15:03 -0400 (0:00:00.233) 0:06:18.176 ********** changed: [managed-node12] => (item={'src': 'UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=4db77d7c-a88d-4e2a-a1b2-6e1b48cb687f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:15:04 -0400 (0:00:01.263) 0:06:19.439 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:15:05 -0400 (0:00:01.203) 0:06:20.643 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:15:06 -0400 (0:00:01.269) 0:06:21.912 ********** skipping: [managed-node12] => (item={'src': '/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:15:07 -0400 (0:00:00.287) 0:06:22.200 ********** ok: 
[managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:15:08 -0400 (0:00:01.683) 0:06:23.884 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471190.066872, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471184.8858507, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 100663494, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776471184.8848505, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "712889311", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:15:10 -0400 (0:00:01.310) 0:06:25.195 ********** changed: [managed-node12] => (item={'backing_device': '/dev/sda', 'name': 'luks-4b7196b5-6e15-4946-9038-d82a70956529', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-4b7196b5-6e15-4946-9038-d82a70956529", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:15:11 -0400 (0:00:01.457) 0:06:26.652 ********** ok: [managed-node12] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:192 Friday 17 April 2026 20:15:13 -0400 (0:00:01.955) 0:06:28.608 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:15:13 -0400 (0:00:00.364) 0:06:28.972 ********** skipping: [managed-node12] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:15:14 -0400 (0:00:00.126) 0:06:29.099 ********** ok: [managed-node12] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": 
null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:15:14 -0400 (0:00:00.196) 0:06:29.295 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "size": "10G", "type": "crypt", "uuid": "4da1274b-06a9-4eb8-9eaa-ebcac6c63b68" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4b7196b5-6e15-4946-9038-d82a70956529" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:15:15 -0400 (0:00:01.587) 0:06:30.883 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002890", "end": "2026-04-17 20:15:17.029865", "rc": 0, "start": "2026-04-17 20:15:17.026975" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 
2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:15:17 -0400 (0:00:01.410) 0:06:32.293 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002660", "end": "2026-04-17 20:15:18.473981", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:15:18.471321" } STDOUT: luks-4b7196b5-6e15-4946-9038-d82a70956529 /dev/sda -
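The two files now cross-reference the same mapping: /etc/fstab mounts the decrypted mapper device /dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529 at /opt/test1, and the /etc/crypttab line carries the three standard crypttab(5) fields (name, backing device, key file), where the key file "-" means systemd-cryptsetup will prompt for the passphrase at activation rather than read a key from disk. A quick manual cross-check in the same spirit as the verify tasks that follow could look like this sketch (an illustrative ad-hoc task, not part of the role; the UUID is the one from this run):

- name: Spot-check that fstab and crypttab reference the same LUKS UUID
  ansible.builtin.command:
    cmd: grep -c 4b7196b5-6e15-4946-9038-d82a70956529 /etc/fstab /etc/crypttab
  changed_when: false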
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:15:18 -0400 (0:00:01.354) 0:06:33.648 **********
TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:15:18 -0400 (0:00:00.253) 0:06:33.901 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12
TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:15:19 -0400 (0:00:00.307) 0:06:34.209 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:15:19 -0400 (0:00:00.247) 0:06:34.457 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12
TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:15:20 -0400 (0:00:00.794) 0:06:35.251 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529" }, "changed": false }
TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:15:20 -0400 (0:00:00.148) 0:06:35.400 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:15:20 -0400 (0:00:00.134) 0:06:35.534 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:15:20 -0400 (0:00:00.365) 0:06:35.899 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed
TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:15:21 -0400 (0:00:00.220) 0:06:36.120 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:15:21 -0400 (0:00:00.236) 0:06:36.357 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:15:21 -0400 (0:00:00.163) 0:06:36.521 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional
result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:15:21 -0400 (0:00:00.217) 0:06:36.738 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:15:21 -0400 (0:00:00.236) 0:06:36.974 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:15:22 -0400 (0:00:00.144) 0:06:37.119 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:15:22 -0400 (0:00:00.175) 0:06:37.294 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:15:22 -0400 (0:00:00.134) 0:06:37.428 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:15:22 -0400 (0:00:00.423) 0:06:37.852 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:15:23 -0400 (0:00:00.186) 0:06:38.038 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:15:23 -0400 (0:00:00.209) 0:06:38.248 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] 
****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:15:23 -0400 (0:00:00.308) 0:06:38.556 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:15:23 -0400 (0:00:00.221) 0:06:38.778 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:15:23 -0400 (0:00:00.137) 0:06:38.915 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:15:24 -0400 (0:00:00.370) 0:06:39.286 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:15:24 -0400 (0:00:00.274) 0:06:39.561 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471298.9173174, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471298.9173174, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36443, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471298.9173174, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:15:26 -0400 (0:00:01.803) 0:06:41.364 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:15:26 -0400 (0:00:00.232) 0:06:41.597 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:15:26 -0400 (0:00:00.263) 0:06:41.861 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:15:27 -0400 (0:00:00.292) 0:06:42.154 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:15:27 -0400 (0:00:00.181) 0:06:42.335 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:15:27 -0400 (0:00:00.325) 0:06:42.660 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:15:27 -0400 (0:00:00.315) 0:06:42.976 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471299.0563178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471299.0563178, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 216414, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471299.0563178, "nlink": 1, "path": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:15:29 -0400 (0:00:01.646) 0:06:44.623 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:15:33 -0400 (0:00:04.118) 0:06:48.741 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010385", "end": "2026-04-17 20:15:34.974169", "rc": 0, "start": "2026-04-17 20:15:34.963784" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 4b7196b5-6e15-4946-9038-d82a70956529 Label: (no label) Subsystem: (no subsystem) Flags: 
(no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 910220 Threads: 2 Salt: 67 af b5 b7 80 38 90 38 1c 9a 2e fb ba 3a 92 30 c6 69 8e 8a 48 ec ec af 31 28 7b 67 61 ec 3d 1f AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 40 1b 41 d4 ba 2f 36 9d fb cc d9 b4 ec ab ae 9a d8 77 4a 55 d1 44 1f ee a5 7a 46 af bd 80 dd 9a Digest: 76 ac 8f 44 ab bc 04 2c 19 f3 a0 76 f8 3c 8a 4e 79 6b c7 3f 06 5f aa 9a 9c 5d d4 a4 65 9f 1a b3
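The dump above is what the next assertions consume: Version: 2 confirms a LUKS2 header on /dev/sda, the header UUID matches the luks-<UUID> mapper name and the crypttab entry, keyslot 0 is protected with the argon2i PBKDF, and the data segment uses aes-xts-plain64 with a 512-bit key (two 256-bit XTS subkeys). A sketch of the kind of check performed next, re-running luksDump and asserting on its output (the register name and regex are illustrative, not the role's actual code):

- name: Dump the LUKS header
  ansible.builtin.command:
    cmd: cryptsetup luksDump /dev/sda
  register: _luks_dump  # illustrative variable name
  changed_when: false

- name: Assert the header is LUKS2
  ansible.builtin.assert:
    that:
      - _luks_dump.stdout is search('Version:\s+2')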
********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4b7196b5-6e15-4946-9038-d82a70956529 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:15:37 -0400 (0:00:00.259) 0:06:52.343 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:15:37 -0400 (0:00:00.174) 0:06:52.518 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:15:37 -0400 (0:00:00.185) 0:06:52.703 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:15:38 -0400 (0:00:00.327) 0:06:53.030 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:15:38 -0400 (0:00:00.270) 0:06:53.301 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:15:38 -0400 (0:00:00.148) 0:06:53.450 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:15:38 -0400 (0:00:00.122) 0:06:53.573 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:15:38 -0400 (0:00:00.202) 0:06:53.775 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:15:38 -0400 (0:00:00.116) 
0:06:53.891 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:15:39 -0400 (0:00:00.184) 0:06:54.075 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:15:39 -0400 (0:00:00.224) 0:06:54.300 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:15:39 -0400 (0:00:00.242) 0:06:54.543 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:15:39 -0400 (0:00:00.053) 0:06:54.597 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:15:39 -0400 (0:00:00.083) 0:06:54.680 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:15:39 -0400 (0:00:00.204) 0:06:54.885 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:15:40 -0400 (0:00:00.132) 0:06:55.018 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:15:40 -0400 (0:00:00.227) 0:06:55.245 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:15:40 -0400 (0:00:00.243) 0:06:55.489 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] 
****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:15:40 -0400 (0:00:00.174) 0:06:55.664 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:15:40 -0400 (0:00:00.276) 0:06:55.940 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:15:41 -0400 (0:00:00.141) 0:06:56.081 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:15:41 -0400 (0:00:00.168) 0:06:56.250 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:15:41 -0400 (0:00:00.269) 0:06:56.519 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:15:41 -0400 (0:00:00.282) 0:06:56.801 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:15:42 -0400 (0:00:00.250) 0:06:57.052 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:15:42 -0400 (0:00:00.265) 0:06:57.318 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:15:42 -0400 (0:00:00.235) 0:06:57.553 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:15:42 -0400 (0:00:00.232) 0:06:57.786 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:15:43 -0400 (0:00:00.321) 0:06:58.108 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:15:43 -0400 (0:00:00.231) 0:06:58.340 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:15:43 -0400 (0:00:00.257) 0:06:58.597 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:15:43 -0400 (0:00:00.372) 0:06:58.970 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:15:44 -0400 (0:00:00.167) 0:06:59.137 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:15:44 -0400 (0:00:00.250) 0:06:59.388 ********** skipping: [managed-node12] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:15:44 -0400 (0:00:00.239) 0:06:59.628 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:15:44 -0400 (0:00:00.157) 0:06:59.786 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:15:45 -0400 (0:00:00.328) 0:07:00.114 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:15:45 -0400 (0:00:00.302) 0:07:00.417 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:15:45 -0400 (0:00:00.296) 0:07:00.713 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:15:45 -0400 (0:00:00.280) 0:07:00.994 ********** ok: [managed-node12] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:15:46 -0400 (0:00:00.321) 0:07:01.316 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:15:46 -0400 (0:00:00.155) 0:07:01.472 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:15:46 -0400 (0:00:00.214) 0:07:01.686 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:15:46 -0400 (0:00:00.129) 0:07:01.815 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:15:46 -0400 (0:00:00.198) 0:07:02.013 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:15:47 -0400 (0:00:00.108) 0:07:02.122 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:15:47 -0400 (0:00:00.155) 0:07:02.278 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:15:47 -0400 (0:00:00.183) 0:07:02.462 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:15:47 -0400 (0:00:00.197) 0:07:02.660 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:15:47 -0400 (0:00:00.180) 0:07:02.840 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:15:48 -0400 (0:00:00.198) 0:07:03.038 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:199 Friday 17 April 2026 20:15:48 -0400 (0:00:00.221) 0:07:03.260 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:15:48 -0400 (0:00:00.454) 0:07:03.714 ********** ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:15:48 -0400 (0:00:00.257) 0:07:03.972 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:15:49 -0400 (0:00:00.366) 0:07:04.339 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:15:49 -0400 (0:00:00.229) 0:07:04.568 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:15:49 -0400 (0:00:00.235) 0:07:04.803 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:15:51 -0400 (0:00:01.721) 0:07:06.525 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:15:51 -0400 (0:00:00.267) 0:07:06.793 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:15:53 -0400 (0:00:01.991) 0:07:08.785 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:15:54 -0400 (0:00:00.435) 0:07:09.221 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:15:54 -0400 (0:00:00.174) 0:07:09.395 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of 
pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:15:54 -0400 (0:00:00.261) 0:07:09.656 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:15:54 -0400 (0:00:00.160) 0:07:09.817 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:15:54 -0400 (0:00:00.143) 0:07:09.961 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:15:55 -0400 (0:00:00.555) 0:07:10.516 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:15:55 -0400 (0:00:00.242) 0:07:10.759 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:15:56 -0400 (0:00:00.256) 0:07:11.015 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:16:00 -0400 (0:00:04.087) 0:07:15.103 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:16:00 -0400 (0:00:00.172) 0:07:15.276 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:16:00 -0400 (0:00:00.210) 0:07:15.487 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories 
if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:16:06 -0400 (0:00:05.611) 0:07:21.099 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:16:06 -0400 (0:00:00.357) 0:07:21.457 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:16:06 -0400 (0:00:00.164) 0:07:21.622 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:16:06 -0400 (0:00:00.271) 0:07:21.893 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:16:07 -0400 (0:00:00.168) 0:07:22.061 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:16:11 -0400 (0:00:04.666) 0:07:26.728 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:16:14 -0400 (0:00:03.143) 0:07:29.872 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:16:15 -0400 (0:00:00.276) 0:07:30.148 ********** fatal: [managed-node12]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:16:20 -0400 (0:00:05.673) 0:07:35.822 ********** fatal: [managed-node12]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 
'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:16:21 -0400 (0:00:00.343) 0:07:36.166 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:16:21 -0400 (0:00:00.282) 0:07:36.449 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:16:22 -0400 (0:00:00.855) 0:07:37.304 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:16:22 -0400 (0:00:00.284) 0:07:37.589 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:219 Friday 17 April 2026 20:16:22 -0400 (0:00:00.193) 0:07:37.782 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:16:23 -0400 (0:00:00.416) 0:07:38.199 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:16:23 -0400 (0:00:00.150) 0:07:38.349 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:16:23 -0400 (0:00:00.363) 0:07:38.713 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:16:25 -0400 (0:00:01.662) 0:07:40.375 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:16:25 -0400 (0:00:00.224) 0:07:40.600 
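The failed run above exercises the role's guard for encrypted volumes: with encryption true but every encryption_key and encryption_password field left at None (see the module args in the failure dump), the blivet module aborts with "encrypted volume 'test1' missing key/password" before touching any device, which is exactly what verify-role-failed.yml then asserts. A minimal sketch of the invocation that triggers it, built only from values visible in this log; the include_role form is an assumption, since the tests wrap the role in helper task files:

    - name: Create an encrypted partition volume (fails: no key or password given)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                # encryption_password: yabbadabbadoo  # the next run supplies this and succeeds

The run that follows ("Create an encrypted partition volume w/ default fs") is the same layout with the password filled in.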
********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:16:27 -0400 (0:00:01.692) 0:07:42.292 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:16:27 -0400 (0:00:00.577) 0:07:42.870 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:16:28 -0400 (0:00:00.281) 0:07:43.151 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:16:28 -0400 (0:00:00.257) 0:07:43.408 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:16:28 -0400 (0:00:00.203) 0:07:43.612 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:16:28 -0400 (0:00:00.207) 0:07:43.819 ********** 
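Both CentOS_8.yml includes are expected: the vars task appears to try one candidate file per specificity level (os family, distribution, distribution plus major version, distribution plus full version), and on CentOS 8 the last two candidates resolve to the same file, so the same facts are simply set twice; the RedHat.yml and CentOS.yml items skip because their existence condition is false. The architecture-sensitive package entry is ordinary inline Jinja selection. A sketch of that pattern, using only values shown above:

    blivet_package_list:
      - python3-blivet
      - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"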
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:16:29 -0400 (0:00:00.495) 0:07:44.315 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:16:29 -0400 (0:00:00.212) 0:07:44.528 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:16:29 -0400 (0:00:00.158) 0:07:44.686 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:16:34 -0400 (0:00:04.707) 0:07:49.394 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:16:34 -0400 (0:00:00.143) 0:07:49.538 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:16:34 -0400 (0:00:00.239) 0:07:49.777 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:16:40 -0400 (0:00:05.302) 0:07:55.080 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:16:40 -0400 (0:00:00.279) 0:07:55.359 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:16:40 -0400 (0:00:00.127) 0:07:55.487 ********** 
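"Get required packages" is a dry pass of the same blivet module that later makes the real changes: called in packages-only mode, it reports just what would need installing, here only cryptsetup, the one new requirement the LUKS2 volume adds. A sketch of that call; the namespaced module name and the exact argument forms are assumptions inferred from the module args printed in the failure dump earlier:

    - name: Get required packages (dry pass, sketch)
      fedora.linux_system_roles.blivet:
        pools: "{{ storage_pools | d([]) }}"
        volumes: "{{ storage_volumes | d([]) }}"
        packages_only: true
      register: package_info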
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:16:40 -0400 (0:00:00.173) 0:07:55.660 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:16:40 -0400 (0:00:00.086) 0:07:55.747 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:16:44 -0400 (0:00:03.898) 0:07:59.645 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, 
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": 
{ "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:16:47 -0400 (0:00:02.757) 0:08:02.402 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:16:47 -0400 (0:00:00.270) 0:08:02.672 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4b7196b5-6e15-4946-9038-d82a70956529", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", 
"state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:17:02 -0400 (0:00:14.568) 0:08:17.241 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:17:02 -0400 (0:00:00.309) 0:08:17.551 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471306.671349, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "17b8133d3b19a50667f721ccc10c87f2faa7a576", "ctime": 1776471306.666349, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471306.666349, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK 
[fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:17:04 -0400 (0:00:01.524) 0:08:19.075 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:17:05 -0400 (0:00:01.233) 0:08:20.309 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:17:05 -0400 (0:00:00.380) 0:08:20.689 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4b7196b5-6e15-4946-9038-d82a70956529", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:17:05 -0400 (0:00:00.236) 0:08:20.926 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:17:06 -0400 (0:00:00.351) 0:08:21.278 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:17:06 -0400 (0:00:00.282) 0:08:21.561 ********** changed: 
[managed-node12] => (item={'src': '/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4b7196b5-6e15-4946-9038-d82a70956529" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:17:07 -0400 (0:00:01.452) 0:08:23.014 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:17:10 -0400 (0:00:02.057) 0:08:25.071 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:17:11 -0400 (0:00:01.435) 0:08:26.507 ********** skipping: [managed-node12] => (item={'src': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:17:11 -0400 (0:00:00.265) 0:08:26.772 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:17:13 -0400 (0:00:01.495) 0:08:28.268 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471318.4723973, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": 
"91dc605ef1a22a336dae684ec6bacbc30aee7a17", "ctime": 1776471311.318368, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 245366986, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471311.316368, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1992130877", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:17:14 -0400 (0:00:01.489) 0:08:29.757 ********** changed: [managed-node12] => (item={'backing_device': '/dev/sda', 'name': 'luks-4b7196b5-6e15-4946-9038-d82a70956529', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-4b7196b5-6e15-4946-9038-d82a70956529", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node12] => (item={'backing_device': '/dev/sda1', 'name': 'luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:17:17 -0400 (0:00:02.589) 0:08:32.347 ********** ok: [managed-node12] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:236 Friday 17 April 2026 20:17:19 -0400 (0:00:01.866) 0:08:34.214 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:17:19 -0400 (0:00:00.626) 0:08:34.841 ********** ok: [managed-node12] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:17:20 -0400 (0:00:00.553) 0:08:35.394 ********** skipping: [managed-node12] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:17:20 -0400 (0:00:00.163) 0:08:35.558 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "size": "4G", "type": "crypt", "uuid": "a513b278-5a06-4daf-8ce9-5fcf03cda6d1" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "814d1f35-ac59-4428-83ec-6f7b65c1154f" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read 
the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:17:21 -0400 (0:00:01.434) 0:08:36.992 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002853", "end": "2026-04-17 20:17:23.068834", "rc": 0, "start": "2026-04-17 20:17:23.065981" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:17:23 -0400 (0:00:01.276) 0:08:38.268 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002432", "end": "2026-04-17 20:17:24.348009", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:17:24.345577" } STDOUT: luks-814d1f35-ac59-4428-83ec-6f7b65c1154f /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:17:24 -0400 (0:00:01.402) 0:08:39.671 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node12 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:17:24 -0400 (0:00:00.263) 0:08:39.934 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:17:25 -0400 (0:00:00.138) 0:08:40.073 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:17:25 -0400 (0:00:00.309) 0:08:40.383 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:17:25 -0400 (0:00:00.210) 0:08:40.593 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:17:25 -0400 (0:00:00.394) 0:08:40.988 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:17:26 -0400 (0:00:00.169) 0:08:41.203 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:17:26 -0400 (0:00:00.374) 0:08:41.578 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:17:26 -0400 (0:00:00.312) 0:08:41.890 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:17:27 -0400 (0:00:00.208) 0:08:42.099 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:17:27 -0400 (0:00:00.260) 0:08:42.359 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:17:27 -0400 (0:00:00.251) 0:08:42.610 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:17:27 -0400 (0:00:00.260) 0:08:42.871 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:17:28 -0400 (0:00:00.257) 0:08:43.128 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:17:28 -0400 (0:00:00.300) 0:08:43.428 ********** ok: [managed-node12] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.96 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:17:30 -0400 (0:00:01.674) 0:08:45.102 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:17:30 -0400 (0:00:00.133) 0:08:45.236 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node12 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:17:30 -0400 (0:00:00.445) 0:08:45.681 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:17:30 -0400 (0:00:00.239) 0:08:45.920 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:17:31 -0400 (0:00:00.164) 0:08:46.085 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:17:31 -0400 (0:00:00.218) 0:08:46.303 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 
20:17:31 -0400 (0:00:00.257) 0:08:46.561 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:17:31 -0400 (0:00:00.322) 0:08:46.883 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:17:32 -0400 (0:00:00.326) 0:08:47.210 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:17:32 -0400 (0:00:00.247) 0:08:47.457 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:17:32 -0400 (0:00:00.194) 0:08:47.652 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:17:32 -0400 (0:00:00.218) 0:08:47.871 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:17:33 -0400 (0:00:00.178) 0:08:48.050 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:17:33 -0400 (0:00:00.153) 0:08:48.203 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node12 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:17:33 -0400 (0:00:00.532) 0:08:48.735 ********** skipping: [managed-node12] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 
'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:17:33 -0400 (0:00:00.257) 0:08:48.993 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node12 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:17:34 -0400 (0:00:00.359) 0:08:49.352 ********** skipping: [managed-node12] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 
'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:17:34 -0400 (0:00:00.220) 0:08:49.573 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:17:34 -0400 (0:00:00.395) 0:08:49.968 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:17:35 -0400 (0:00:00.153) 0:08:50.122 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:17:35 -0400 (0:00:00.229) 0:08:50.352 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:17:35 -0400 (0:00:00.304) 0:08:50.656 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:17:35 -0400 (0:00:00.224) 0:08:50.881 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node12 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:17:36 -0400 (0:00:00.541) 0:08:51.422 ********** skipping: [managed-node12] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:17:36 -0400 (0:00:00.221) 0:08:51.643 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node12 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:17:37 -0400 (0:00:00.405) 0:08:52.049 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:17:37 -0400 (0:00:00.206) 0:08:52.256 ********** skipping: [managed-node12] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:17:37 -0400 (0:00:00.317) 0:08:52.574 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:17:37 -0400 (0:00:00.217) 0:08:52.791 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:17:38 -0400 (0:00:00.238) 0:08:53.030 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:17:38 -0400 (0:00:00.232) 0:08:53.263 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:17:38 -0400 (0:00:00.205) 0:08:53.468 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:17:38 -0400 (0:00:00.241) 0:08:53.710 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, 
"_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:17:38 -0400 (0:00:00.202) 0:08:53.912 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:17:39 -0400 (0:00:00.400) 0:08:54.313 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:17:39 -0400 (0:00:00.271) 0:08:54.584 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:17:40 -0400 (0:00:01.396) 0:08:55.980 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:17:41 -0400 (0:00:00.217) 0:08:56.198 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:17:41 -0400 (0:00:00.252) 0:08:56.450 ********** skipping: [managed-node12] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:17:41 -0400 (0:00:00.331) 0:08:56.781 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:17:42 -0400 (0:00:00.749) 0:08:57.531 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:17:42 -0400 (0:00:00.227) 0:08:57.759 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:17:43 -0400 (0:00:00.289) 0:08:58.048 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:17:43 -0400 (0:00:00.223) 0:08:58.272 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:17:43 -0400 (0:00:00.248) 0:08:58.520 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:17:43 -0400 (0:00:00.298) 0:08:58.819 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:17:44 -0400 (0:00:00.295) 0:08:59.114 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:17:44 -0400 (0:00:00.157) 0:08:59.272 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": 
"1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:17:44 -0400 (0:00:00.407) 0:08:59.679 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:17:44 -0400 (0:00:00.263) 0:08:59.942 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:17:45 -0400 (0:00:00.307) 0:09:00.250 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:17:45 -0400 (0:00:00.298) 0:09:00.548 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:17:45 -0400 (0:00:00.188) 0:09:00.737 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:17:45 -0400 (0:00:00.256) 0:09:00.994 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:17:46 -0400 (0:00:00.341) 0:09:01.336 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:17:46 -0400 (0:00:00.250) 0:09:01.587 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471421.7108195, "attr_flags": "", "attributes": [], "block_size": 4096, 
"blocks": 0, "charset": "binary", "ctime": 1776471421.7108195, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 231465, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471421.7108195, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:17:48 -0400 (0:00:01.901) 0:09:03.488 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:17:48 -0400 (0:00:00.375) 0:09:03.864 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:17:49 -0400 (0:00:00.286) 0:09:04.150 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:17:49 -0400 (0:00:00.291) 0:09:04.442 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:17:49 -0400 (0:00:00.240) 0:09:04.682 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:17:49 -0400 (0:00:00.244) 0:09:04.927 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:17:50 -0400 (0:00:00.329) 0:09:05.256 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471421.86982, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471421.86982, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 230893, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, 
"issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471421.86982, "nlink": 1, "path": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:17:51 -0400 (0:00:01.465) 0:09:06.721 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:17:55 -0400 (0:00:04.284) 0:09:11.005 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010353", "end": "2026-04-17 20:17:57.535235", "rc": 0, "start": "2026-04-17 20:17:57.524882" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 814d1f35-ac59-4428-83ec-6f7b65c1154f Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 906611 Threads: 2 Salt: 66 c1 6b 3f 8a 2a 2e 1c 6e c9 ab 81 e3 a0 61 b4 00 1c 86 77 51 4a 5f cd 5b e5 fb 99 9c 43 70 e7 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: e3 0e 85 61 e7 4b 91 9f be 84 6f 11 e5 f0 fd 51 ee 9d f0 d2 3a 69 f6 09 3d ef 9e 5c fd a6 2c 61 Digest: bd 9b 11 d6 e2 79 46 76 0c ae 4d f5 40 0f 76 e7 e5 54 ec 22 43 c2 50 b1 08 42 d8 21 f8 71 0e 34 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:17:57 -0400 (0:00:01.818) 0:09:12.824 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:17:58 -0400 (0:00:00.259) 0:09:13.083 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:17:58 -0400 (0:00:00.362) 0:09:13.445 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:17:58 -0400 (0:00:00.397) 
0:09:13.843 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:17:59 -0400 (0:00:00.319) 0:09:14.163 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:17:59 -0400 (0:00:00.445) 0:09:14.608 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:17:59 -0400 (0:00:00.364) 0:09:14.973 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:18:00 -0400 (0:00:00.241) 0:09:15.215 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-814d1f35-ac59-4428-83ec-6f7b65c1154f /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:18:00 -0400 (0:00:00.357) 0:09:15.572 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:18:00 -0400 (0:00:00.292) 0:09:15.865 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:18:01 -0400 (0:00:00.359) 0:09:16.225 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:18:01 -0400 (0:00:00.292) 0:09:16.517 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:18:02 -0400 (0:00:00.546) 0:09:17.063 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, 
"_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:18:02 -0400 (0:00:00.299) 0:09:17.363 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:18:02 -0400 (0:00:00.304) 0:09:17.668 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:18:02 -0400 (0:00:00.233) 0:09:17.901 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:18:03 -0400 (0:00:00.200) 0:09:18.101 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:18:03 -0400 (0:00:00.361) 0:09:18.463 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:18:03 -0400 (0:00:00.338) 0:09:18.801 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:18:04 -0400 (0:00:00.298) 0:09:19.100 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:18:04 -0400 (0:00:00.159) 0:09:19.259 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:18:04 -0400 (0:00:00.263) 0:09:19.522 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:18:04 -0400 (0:00:00.195) 0:09:19.718 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:18:04 -0400 (0:00:00.231) 0:09:19.949 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:18:05 -0400 (0:00:00.265) 0:09:20.215 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:18:05 -0400 (0:00:00.314) 0:09:20.529 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:18:05 -0400 (0:00:00.263) 0:09:20.793 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:18:06 -0400 (0:00:00.282) 0:09:21.076 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:18:06 -0400 (0:00:00.186) 0:09:21.262 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:18:06 -0400 (0:00:00.299) 0:09:21.561 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:18:06 -0400 (0:00:00.248) 0:09:21.810 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:18:07 -0400 (0:00:00.300) 0:09:22.110 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } 
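
The percentage-size tasks above are all skipped in this run: the volume's size is the absolute value "4g", so there is no percentage to resolve. When a volume size is instead expressed as a percentage of its pool, the expected size has to be derived from the pool size; a hypothetical sketch of that kind of calculation (variable names invented for illustration, not the role's own):

- hosts: managed-node12
  vars:
    pool_size_bytes: 10737418240  # assumed 10 GiB pool, for illustration only
    volume_size: "60%"
  tasks:
    - name: Derive the expected volume size from the percentage
      set_fact:
        expected_size_bytes: "{{ ((volume_size | regex_replace('%', '') | int) * pool_size_bytes / 100) | int }}"

    - name: Show the derived size
      debug:
        var: expected_size_bytes

With these example values the play prints 6442450944 (60% of 10 GiB); in this run the corresponding role tasks never execute, which is why each one reports "Conditional result was False".
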
TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:18:07 -0400 (0:00:00.280) 0:09:22.391 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:18:07 -0400 (0:00:00.268) 0:09:22.660 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:18:07 -0400 (0:00:00.213) 0:09:22.873 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:18:08 -0400 (0:00:00.273) 0:09:23.146 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:18:08 -0400 (0:00:00.265) 0:09:23.412 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:18:08 -0400 (0:00:00.372) 0:09:23.784 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:18:09 -0400 (0:00:00.281) 0:09:24.067 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:18:09 -0400 (0:00:00.237) 0:09:24.304 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:18:09 -0400 (0:00:00.194) 0:09:24.498 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:18:09 -0400 (0:00:00.293) 0:09:24.792 ********** skipping: [managed-node12] => 
{}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Friday 17 April 2026 20:18:10 -0400 (0:00:00.251) 0:09:25.043 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Friday 17 April 2026 20:18:10 -0400 (0:00:00.338) 0:09:25.382 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Friday 17 April 2026 20:18:10 -0400 (0:00:00.291) 0:09:25.674 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Friday 17 April 2026 20:18:11 -0400 (0:00:00.356) 0:09:26.030 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Friday 17 April 2026 20:18:11 -0400 (0:00:00.269) 0:09:26.352 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Friday 17 April 2026 20:18:11 -0400 (0:00:00.272) 0:09:26.621 **********
ok: [managed-node12] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Friday 17 April 2026 20:18:11 -0400 (0:00:00.233) 0:09:26.894 **********
ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Friday 17 April 2026 20:18:12 -0400 (0:00:00.167) 0:09:27.128 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 April 2026 20:18:12 -0400 (0:00:00.167) 0:09:27.296 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
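The "VARIABLE IS NOT DEFINED!" line printed by "Show expected size - 2" above is normal debug-module output when the variable was never set on this code path (all of the thin-pool sizing tasks were skipped for this partition-based volume). If a test wanted that debug task to stay quiet about unset variables, the usual idiom is the d() default filter; a minimal sketch, with a hypothetical task name:

    - name: Show expected size, tolerating an unset variable
      debug:
        msg: "{{ storage_test_expected_size | d('not set') }}"

TASK [Set LV segment type] *****************************************************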
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 April 2026 20:18:12 -0400 (0:00:00.232) 0:09:27.528 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 April 2026 20:18:12 -0400 (0:00:00.164) 0:09:27.693 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 April 2026 20:18:12 -0400 (0:00:00.238) 0:09:27.931 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 April 2026 20:18:13 -0400 (0:00:00.238) 0:09:28.170 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 April 2026 20:18:13 -0400 (0:00:00.332) 0:09:28.502 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 April 2026 20:18:13 -0400 (0:00:00.270) 0:09:28.773 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 April 2026 20:18:14 -0400 (0:00:00.282) 0:09:29.055 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Friday 17 April 2026 20:18:14 -0400 (0:00:00.247) 0:09:29.303 **********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Friday 17 April 2026 20:18:14 -0400 (0:00:00.243) 0:09:29.547 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
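The next task drops a marker file onto the still-mounted volume; the verify-data-preservation steps later in the test then check the file is intact, proving the role did not silently reformat the device. A minimal sketch of what such a helper task can look like (the path matches the output below; the exact contents of create-test-file.yml are not reproduced here):

    - name: Create a file
      file:
        path: /opt/test1/quux
        state: touch

TASK [Create a file] ***********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Friday 17 April 2026 20:18:14 -0400 (0:00:00.339) 0:09:29.887 **********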
true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:242 Friday 17 April 2026 20:18:16 -0400 (0:00:01.300) 0:09:31.187 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:18:16 -0400 (0:00:00.561) 0:09:31.749 ********** ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:18:16 -0400 (0:00:00.183) 0:09:31.933 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:18:17 -0400 (0:00:00.270) 0:09:32.203 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:18:17 -0400 (0:00:00.188) 0:09:32.391 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:18:17 -0400 (0:00:00.224) 0:09:32.616 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:18:19 -0400 (0:00:01.583) 0:09:34.200 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:18:19 -0400 (0:00:00.191) 0:09:34.392 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:18:21 -0400 (0:00:02.046) 0:09:36.439 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": 
"item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:18:21 -0400 (0:00:00.494) 0:09:36.934 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:18:22 -0400 (0:00:00.271) 0:09:37.205 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:18:22 -0400 (0:00:00.280) 0:09:37.486 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:18:22 -0400 (0:00:00.136) 0:09:37.623 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:18:22 -0400 (0:00:00.190) 0:09:37.813 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:18:23 -0400 (0:00:00.508) 
Friday 17 April 2026 20:18:23 -0400 (0:00:00.508) 0:09:38.321 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Friday 17 April 2026 20:18:23 -0400 (0:00:00.267) 0:09:38.589 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Friday 17 April 2026 20:18:23 -0400 (0:00:00.233) 0:09:38.822 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Friday 17 April 2026 20:18:28 -0400 (0:00:05.082) 0:09:43.905 **********
ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 17 April 2026 20:18:29 -0400 (0:00:00.277) 0:09:44.183 **********
ok: [managed-node12] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 April 2026 20:18:29 -0400 (0:00:00.238) 0:09:44.421 **********
ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Friday 17 April 2026 20:18:34 -0400 (0:00:05.522) 0:09:49.944 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 April 2026 20:18:35 -0400 (0:00:00.276) 0:09:50.221 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 April 2026 20:18:35 -0400 (0:00:00.204) 0:09:50.425 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 April 2026 20:18:35 -0400 (0:00:00.165) 0:09:50.591 **********
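The storage_pools value echoed by "Show storage_pools" above is the spec this test passes to the role: a partition-type pool "foo" on sda with one 4g xfs volume mounted at /opt/test1, now requested with encryption off even though the device on disk still carries LUKS formatting; that mismatch is what the safe-mode failure further down is meant to exercise. The play vars behind it would look roughly like this sketch (layout assumed, not copied from tests_luks2.yml):

    storage_pools:
      - name: foo
        type: partition
        disks: [sda]
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: false
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo

TASK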
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:18:35 -0400 (0:00:00.138) 0:09:50.729 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:18:40 -0400 (0:00:05.071) 0:09:55.801 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": 
"dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service": { "name": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service": { "name": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:18:43 -0400 (0:00:02.847) 0:09:58.649 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d4b7196b5\x2d6e15\x2d4946\x2d9038\x2dd82a70956529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "name": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw 
cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-4b7196b5-6e15-4946-9038-d82a70956529", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4b7196b5-6e15-4946-9038-d82a70956529 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4b7196b5-6e15-4946-9038-d82a70956529 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", 
"MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:17:13 EDT", "StateChangeTimestampMonotonic": "2068033222", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...d6e15\x2d4946\x2d9038\x2dd82a70956529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "name": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": 
"[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": 
"infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:18:47 -0400 (0:00:03.722) 0:10:02.371 ********** fatal: [managed-node12]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-814d1f35-ac59-4428-83ec-6f7b65c1154f' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:18:52 -0400 (0:00:04.937) 0:10:07.308 ********** fatal: [managed-node12]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-814d1f35-ac59-4428-83ec-6f7b65c1154f' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:18:52 -0400 (0:00:00.179) 0:10:07.487 ********** changed: [managed-node12] => 
(item=systemd-cryptsetup@luks\x2d4b7196b5\x2d6e15\x2d4946\x2d9038\x2dd82a70956529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "name": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d4b7196b5\\x2d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...d6e15\x2d4946\x2d9038\x2dd82a70956529.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "name": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6e15\\x2d4946\\x2d9038\\x2dd82a70956529.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:18:55 -0400 (0:00:02.703) 0:10:10.191 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:18:55 -0400 (0:00:00.148) 0:10:10.339 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Friday 17 April 2026 20:18:55 -0400 (0:00:00.267) 0:10:10.607 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:18:55 -0400 (0:00:00.139) 0:10:10.746 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471495.989123, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471495.989123, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471495.989123, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1691714166", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:18:57 -0400 (0:00:01.719) 0:10:12.466 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:267 Friday 17 April 2026 20:18:57 -0400 (0:00:00.241) 0:10:12.707 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:18:58 -0400 (0:00:00.521) 0:10:13.228 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:18:58 -0400 (0:00:00.257) 0:10:13.485 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:18:58 -0400 (0:00:00.223) 0:10:13.709 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:19:00 -0400 (0:00:01.365) 0:10:15.075 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for 
managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:19:00 -0400 (0:00:00.143) 0:10:15.218 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:19:01 -0400 (0:00:01.709) 0:10:16.927 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:19:02 -0400 (0:00:00.474) 0:10:17.402 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:19:02 -0400 (0:00:00.265) 0:10:17.667 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:19:02 -0400 (0:00:00.240) 0:10:17.908 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:19:03 -0400 (0:00:00.179) 0:10:18.087 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
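
The variable loading above is the standard linux_system_roles pattern: candidate vars files are tried from least to most specific (RedHat.yml, then CentOS.yml, then CentOS_8.yml) so that more specific files override earlier ones, and a file is only included when it exists; CentOS_8.yml appears twice presumably because both the major-version and the full-version candidates resolve to the same filename on this host. A minimal sketch of the pattern (the role's actual set_vars.yml may differ in detail):

    - name: Set platform/version specific variables
      include_vars: "{{ __vars_file }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file
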
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:19:03 -0400 (0:00:00.143) 0:10:18.230 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:19:03 -0400 (0:00:00.504) 0:10:18.735 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:19:03 -0400 (0:00:00.235) 0:10:18.971 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:19:04 -0400 (0:00:00.201) 0:10:19.173 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:19:08 -0400 (0:00:03.965) 0:10:23.138 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:19:08 -0400 (0:00:00.154) 0:10:23.293 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:19:08 -0400 (0:00:00.167) 0:10:23.461 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:19:13 -0400 (0:00:05.183) 0:10:28.645 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:19:13 -0400 (0:00:00.238) 0:10:28.883 ********** TASK 
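
One detail worth noting in the Show storage_pools output above: the blivet module masks the volume's encryption_password as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER in its own argument dump (see the earlier failure), but this debug task prints the raw storage_pools variable, so the test password appears in the log verbatim. A debug task of this shape is what produces that output; adding no_log would be a hypothetical hardening if the variable can carry secrets:

    - name: Show storage_pools
      debug:
        var: storage_pools | d([])
      # no_log: true   # hypothetical: would suppress the cleartext password seen above
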
[fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:19:14 -0400 (0:00:00.142) 0:10:29.025 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:19:14 -0400 (0:00:00.237) 0:10:29.263 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:19:14 -0400 (0:00:00.118) 0:10:29.382 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:19:18 -0400 (0:00:04.103) 0:10:33.486 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": 
"static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service": { "name": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service": { "name": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:19:21 -0400 (0:00:03.180) 0:10:36.666 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d814d1f35\x2dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-sda1.device cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", 
"CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-814d1f35-ac59-4428-83ec-6f7b65c1154f /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-814d1f35-ac59-4428-83ec-6f7b65c1154f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:18:47 EDT", "StateChangeTimestampMonotonic": "2162140297", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:19:25 -0400 (0:00:03.737) 0:10:40.404 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", 
"fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:19:31 -0400 (0:00:05.949) 0:10:46.353 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:19:31 -0400 (0:00:00.173) 0:10:46.526 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471431.2938588, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7971c50c13fae2967775a604c909529ceea38f7b", "ctime": 1776471431.2908587, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": 
false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471431.2908587, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:19:32 -0400 (0:00:01.142) 0:10:47.669 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:19:33 -0400 (0:00:01.163) 0:10:48.833 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d814d1f35\x2dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": 
"[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:18:47 EDT", "StateChangeTimestampMonotonic": "2162140297", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:19:36 -0400 (0:00:03.112) 0:10:51.946 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:19:37 -0400 (0:00:00.265) 0:10:52.211 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:19:37 -0400 (0:00:00.246) 0:10:52.457 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:19:37 -0400 (0:00:00.237) 0:10:52.695 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-814d1f35-ac59-4428-83ec-6f7b65c1154f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:19:39 -0400 (0:00:01.544) 0:10:54.240 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] 
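The removal above (state absent for the stale /dev/mapper source) followed by a systemd daemon reload is consistent with Ansible's mount and systemd modules, and the next task applies the mounted entries the same way. A minimal sketch of that next step, assuming those modules and reusing the loop variable name from the log:

    - name: Set up new/current mounts
      mount:
        src: "{{ mount_info.src }}"
        path: "{{ mount_info.path }}"
        fstype: "{{ mount_info.fstype }}"
        opts: "{{ mount_info.opts }}"
        state: "{{ mount_info.state }}"
      loop: "{{ blivet_output.mounts | selectattr('state', 'equalto', 'mounted') | list }}"
      loop_control:
        loop_var: mount_info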
*********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:19:40 -0400 (0:00:01.661) 0:10:55.902 ********** changed: [managed-node12] => (item={'src': 'UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:19:42 -0400 (0:00:01.472) 0:10:57.374 ********** skipping: [managed-node12] => (item={'src': 'UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:19:42 -0400 (0:00:00.252) 0:10:57.627 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:19:44 -0400 (0:00:01.825) 0:10:59.452 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471444.346912, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e9b1183a9ba8df85413ab03a45f0214dea0c7fb2", "ctime": 1776471437.1588826, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 377487493, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471437.1588826, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "854482478", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:19:45 -0400 (0:00:01.258) 0:11:00.711 ********** changed: [managed-node12] => 
(item={'backing_device': '/dev/sda1', 'name': 'luks-814d1f35-ac59-4428-83ec-6f7b65c1154f', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:19:47 -0400 (0:00:01.505) 0:11:02.216 ********** ok: [managed-node12] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:284 Friday 17 April 2026 20:19:49 -0400 (0:00:01.896) 0:11:04.113 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:19:49 -0400 (0:00:00.463) 0:11:04.576 ********** ok: [managed-node12] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:19:49 -0400 (0:00:00.196) 0:11:04.773 ********** skipping: [managed-node12] => {} TASK [Collect info about the volumes.] 
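The "1 line(s) removed" message above records the stale luks-814d1f35-ac59-4428-83ec-6f7b65c1154f entry being dropped from /etc/crypttab, driven by the crypts list in blivet_output. A minimal sketch of such a task, assuming Ansible's crypttab module (the entry fields and the entry loop variable match the output above):

    - name: Manage /etc/crypttab to account for changes we just made
      crypttab:
        name: "{{ entry.name }}"
        backing_device: "{{ entry.backing_device }}"
        password: "{{ entry.password }}"
        state: "{{ entry.state }}"
      loop: "{{ blivet_output.crypts }}"
      loop_control:
        loop_var: entry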
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:19:49 -0400 (0:00:00.193) 0:11:04.966 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "9c3b0166-74e1-4668-ac43-dd9ddb47d33a" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:19:51 -0400 (0:00:01.174) 0:11:06.141 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003014", "end": "2026-04-17 20:19:52.181759", "rc": 0, "start": "2026-04-17 20:19:52.178745" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:19:52 -0400 (0:00:01.228) 0:11:07.369 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003162", "end": "2026-04-17 20:19:53.622938", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:19:53.619776" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:19:53 -0400 (0:00:01.548) 0:11:08.918 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node12 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:19:54 -0400 (0:00:00.347) 0:11:09.266 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:19:54 -0400 (0:00:00.250) 0:11:09.517 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:19:54 -0400 (0:00:00.292) 0:11:09.809 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:19:55 -0400 (0:00:00.233) 0:11:10.043 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node12 included:
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:19:55 -0400 (0:00:00.434) 0:11:10.477 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:19:55 -0400 (0:00:00.192) 0:11:10.670 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:19:55 -0400 (0:00:00.274) 0:11:10.944 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:19:56 -0400 (0:00:00.265) 0:11:11.210 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:19:56 -0400 (0:00:00.300) 0:11:11.510 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:19:56 -0400 (0:00:00.272) 0:11:11.782 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:19:57 -0400 (0:00:00.268) 0:11:12.051 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:19:57 -0400 (0:00:00.286) 0:11:12.337 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:19:57 -0400 (0:00:00.246) 0:11:12.584 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:19:57 
-0400 (0:00:00.184) 0:11:12.768 ********** ok: [managed-node12] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.96 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:19:59 -0400 (0:00:01.797) 0:11:14.566 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:19:59 -0400 (0:00:00.175) 0:11:14.742 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node12 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:20:00 -0400 (0:00:00.539) 0:11:15.281 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:20:00 -0400 (0:00:00.240) 0:11:15.522 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:20:00 -0400 (0:00:00.180) 0:11:15.702 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:20:00 -0400 (0:00:00.182) 0:11:15.884 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:20:01 -0400 (0:00:00.261) 0:11:16.146 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:20:01 -0400 (0:00:00.257) 0:11:16.404 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:20:01 -0400 (0:00:00.246) 0:11:16.651 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
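The rc 1 result above, tolerated through failed_when, is a capability probe: the grow-to-fill verification only runs when the installed blivet exposes that feature, and on this node it does not, so the dependent loop produces no items. A minimal sketch of such a probe, assuming a hypothetical Python check (the actual probe command is not shown in this log):

    - name: Check that blivet supports PV grow to fill
      command: >-
        python3 -c "import blivet.formats.lvmpv as pv;
        raise SystemExit(0 if hasattr(pv.LVMPhysicalVolume, 'grow_to_fill') else 1)"
      register: storage_test_grow_to_fill  # hypothetical register name
      failed_when: false
      changed_when: false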
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:20:01 -0400 (0:00:00.274) 0:11:16.925 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:20:02 -0400 (0:00:00.263) 0:11:17.188 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:20:02 -0400 (0:00:00.264) 0:11:17.452 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:20:02 -0400 (0:00:00.235) 0:11:17.688 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:20:02 -0400 (0:00:00.176) 0:11:17.864 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node12 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:20:03 -0400 (0:00:00.507) 0:11:18.372 ********** skipping: [managed-node12] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": 
"/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:20:03 -0400 (0:00:00.277) 0:11:18.649 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node12 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:20:04 -0400 (0:00:00.484) 0:11:19.133 ********** skipping: [managed-node12] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:20:04 -0400 (0:00:00.276) 0:11:19.409 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:20:04 -0400 (0:00:00.576) 0:11:19.986 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:20:05 -0400 (0:00:00.204) 0:11:20.191 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:20:05 -0400 (0:00:00.118) 0:11:20.309 ********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:20:05 -0400 (0:00:00.238) 0:11:20.548 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:20:05 -0400 (0:00:00.185) 0:11:20.734 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node12 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:20:06 -0400 (0:00:00.519) 0:11:21.254 ********** skipping: [managed-node12] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:20:06 -0400 (0:00:00.275) 0:11:21.529 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node12 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:20:06 -0400 (0:00:00.442) 0:11:21.972 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:20:07 -0400 (0:00:00.242) 0:11:22.214 ********** skipping: [managed-node12] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:20:07 
-0400 (0:00:00.236) 0:11:22.451 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:20:07 -0400 (0:00:00.276) 0:11:22.727 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:20:07 -0400 (0:00:00.256) 0:11:22.984 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:20:08 -0400 (0:00:00.210) 0:11:23.195 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:20:09 -0400 (0:00:00.860) 0:11:24.055 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:20:09 -0400 (0:00:00.096) 0:11:24.152 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:20:09 -0400 (0:00:00.175) 0:11:24.328 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:20:09 -0400 (0:00:00.303) 0:11:24.632 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:20:09 -0400 (0:00:00.119) 0:11:24.752 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included:
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:20:10 -0400 (0:00:00.871) 0:11:25.623 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:20:11 -0400 (0:00:00.522) 0:11:26.145 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:20:11 -0400 (0:00:00.324) 0:11:26.470 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:20:11 -0400 (0:00:00.327) 0:11:26.798 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:20:12 -0400 (0:00:00.331) 0:11:27.129 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:20:12 -0400 (0:00:00.378) 0:11:27.508 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:20:12 -0400 (0:00:00.359) 0:11:27.868 ********** 
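
The mount verification above sets storage_test_device_path (/dev/sda1) and the expected mount point (/opt/test1) as facts, then asserts against the gathered mount table. A minimal standalone sketch of the same idea, with a trivial play wrapped around it (illustrative only, not the test's own task; the play header and variable names are assumptions):

- name: Sketch of the mount-state check
  hosts: managed-node12
  vars:
    expected_device: /dev/sda1         # storage_test_device_path in the run above
    expected_mount_point: /opt/test1   # storage_test_mount_expected_mount_point above
  tasks:
    - name: Verify the current mount state by device
      assert:
        that:
          # exactly one entry in the mount facts pairs this device with this path
          - >-
            ansible_facts['mounts']
            | selectattr('device', 'equalto', expected_device)
            | selectattr('mount', 'equalto', expected_mount_point)
            | list | length == 1
        msg: "{{ expected_device }} is not mounted on {{ expected_mount_point }}"

The swap tasks that follow appear to be the same pattern inverted: storage_test_swap_expected_matches is "0", i.e. the device must not show up among active swaps.
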
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:20:13 -0400 (0:00:00.250) 0:11:28.118 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:20:13 -0400 (0:00:00.253) 0:11:28.372 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:20:13 -0400 (0:00:00.244) 0:11:28.617 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:20:13 -0400 (0:00:00.156) 0:11:28.773 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:20:13 -0400 (0:00:00.109) 0:11:28.883 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:20:14 -0400 (0:00:00.544) 0:11:29.428 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:20:14 -0400 (0:00:00.256) 0:11:29.684 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:20:14 -0400 (0:00:00.204) 0:11:29.889 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:20:15 -0400 (0:00:00.258) 0:11:30.148 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:20:15 -0400 (0:00:00.328) 0:11:30.476 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:20:15 -0400 (0:00:00.170) 0:11:30.647 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:20:15 -0400 (0:00:00.365) 0:11:31.012 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:20:16 -0400 (0:00:00.378) 0:11:31.390 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471571.1314309, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471571.1314309, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 231465, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471571.1314309, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:20:17 -0400 (0:00:01.547) 0:11:32.938 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:20:18 -0400 (0:00:00.277) 0:11:33.215 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:20:18 -0400 (0:00:00.328) 0:11:33.543 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:20:18 -0400 (0:00:00.287) 0:11:33.830 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:20:19 -0400 (0:00:00.340) 0:11:34.171 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:20:19 -0400 (0:00:00.267) 0:11:34.439 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:20:19 -0400 (0:00:00.242) 0:11:34.682 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:20:19 -0400 (0:00:00.278) 0:11:34.960 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:20:25 -0400 (0:00:05.105) 0:11:40.065 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:20:25 -0400 (0:00:00.280) 0:11:40.346 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:20:25 -0400 (0:00:00.315) 0:11:40.662 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:20:26 -0400 (0:00:00.401) 0:11:41.063 
********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:20:26 -0400 (0:00:00.315) 0:11:41.379 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:20:26 -0400 (0:00:00.196) 0:11:41.576 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:20:26 -0400 (0:00:00.221) 0:11:41.797 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:20:26 -0400 (0:00:00.169) 0:11:41.967 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:20:27 -0400 (0:00:00.261) 0:11:42.228 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:20:27 -0400 (0:00:00.342) 0:11:42.571 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:20:27 -0400 (0:00:00.147) 0:11:42.718 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:20:28 -0400 (0:00:00.343) 0:11:43.062 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:20:28 -0400 (0:00:00.204) 0:11:43.266 ********** skipping: [managed-node12] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:20:28 -0400 (0:00:00.338) 0:11:43.605 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:20:28 -0400 (0:00:00.197) 0:11:43.803 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:20:29 -0400 (0:00:00.243) 0:11:44.046 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:20:29 -0400 (0:00:00.264) 0:11:44.311 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:20:29 -0400 (0:00:00.256) 0:11:44.568 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:20:29 -0400 (0:00:00.201) 0:11:44.769 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:20:30 -0400 (0:00:00.263) 0:11:45.033 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:20:30 -0400 (0:00:00.233) 0:11:45.267 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:20:30 -0400 (0:00:00.194) 0:11:45.461 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID 
metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:20:30 -0400 (0:00:00.236) 0:11:45.697 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:20:30 -0400 (0:00:00.276) 0:11:45.973 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:20:31 -0400 (0:00:00.267) 0:11:46.241 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:20:31 -0400 (0:00:00.359) 0:11:46.600 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:20:31 -0400 (0:00:00.297) 0:11:46.898 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:20:32 -0400 (0:00:00.209) 0:11:47.107 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:20:32 -0400 (0:00:00.359) 0:11:47.467 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:20:32 -0400 (0:00:00.392) 0:11:47.859 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:20:33 -0400 (0:00:00.341) 0:11:48.200 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:20:33 -0400 (0:00:00.243) 0:11:48.444 ********** skipping: 
[managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:20:33 -0400 (0:00:00.302) 0:11:48.747 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:20:34 -0400 (0:00:00.290) 0:11:49.037 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:20:34 -0400 (0:00:00.333) 0:11:49.371 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:20:34 -0400 (0:00:00.278) 0:11:49.650 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:20:34 -0400 (0:00:00.293) 0:11:49.943 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:20:35 -0400 (0:00:00.289) 0:11:50.233 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:20:35 -0400 (0:00:00.297) 0:11:50.531 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:20:35 -0400 (0:00:00.244) 0:11:50.775 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:20:36 -0400 (0:00:00.318) 0:11:51.093 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 
2026 20:20:36 -0400 (0:00:00.269) 0:11:51.363 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:20:36 -0400 (0:00:00.359) 0:11:51.723 ********** skipping: [managed-node12] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:20:37 -0400 (0:00:00.318) 0:11:52.041 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:20:37 -0400 (0:00:00.366) 0:11:52.407 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:20:37 -0400 (0:00:00.331) 0:11:52.739 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:20:38 -0400 (0:00:00.301) 0:11:53.041 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:20:38 -0400 (0:00:00.275) 0:11:53.317 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:20:38 -0400 (0:00:00.298) 0:11:53.615 ********** ok: [managed-node12] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:20:38 -0400 (0:00:00.245) 0:11:53.861 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:20:39 -0400 (0:00:00.206) 0:11:54.068 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] 
******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:20:39 -0400 (0:00:00.334) 0:11:54.402 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:20:39 -0400 (0:00:00.327) 0:11:54.730 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:20:39 -0400 (0:00:00.228) 0:11:54.959 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:20:40 -0400 (0:00:00.295) 0:11:55.254 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:20:40 -0400 (0:00:00.274) 0:11:55.529 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:20:40 -0400 (0:00:00.281) 0:11:55.810 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:20:41 -0400 (0:00:00.401) 0:11:56.212 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:20:41 -0400 (0:00:00.400) 0:11:56.612 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:20:41 -0400 (0:00:00.296) 0:11:56.908 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:20:42 -0400 (0:00:00.218) 0:11:57.126 ********** ok: [managed-node12] => { 
"ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:20:42 -0400 (0:00:00.280) 0:11:57.407 ********** changed: [managed-node12] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Friday 17 April 2026 20:20:43 -0400 (0:00:01.525) 0:11:58.932 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:20:44 -0400 (0:00:00.548) 0:11:59.480 ********** ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:20:44 -0400 (0:00:00.422) 0:11:59.903 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:20:45 -0400 (0:00:00.235) 0:12:00.138 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:20:45 -0400 (0:00:00.156) 0:12:00.294 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:20:45 -0400 (0:00:00.244) 0:12:00.539 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:20:47 -0400 (0:00:01.755) 0:12:02.295 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 
Friday 17 April 2026 20:20:47 -0400 (0:00:00.229) 0:12:02.524 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:20:49 -0400 (0:00:02.082) 0:12:04.606 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:20:50 -0400 (0:00:00.545) 0:12:05.152 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:20:50 -0400 (0:00:00.292) 0:12:05.444 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:20:50 -0400 (0:00:00.239) 0:12:05.684 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:20:50 -0400 (0:00:00.123) 0:12:05.807 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 
2026 20:20:51 -0400 (0:00:00.236) 0:12:06.044 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:20:51 -0400 (0:00:00.620) 0:12:06.664 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:20:51 -0400 (0:00:00.227) 0:12:06.892 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:20:53 -0400 (0:00:01.273) 0:12:08.166 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:20:57 -0400 (0:00:04.120) 0:12:12.287 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:20:57 -0400 (0:00:00.241) 0:12:12.529 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:20:57 -0400 (0:00:00.170) 0:12:12.699 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:21:03 -0400 (0:00:05.501) 0:12:18.201 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:21:03 -0400 (0:00:00.404) 0:12:18.605 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 
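
The "Show storage_pools" dump above is just the JSON view of the variable handed to the role; as playbook YAML the same input would read (a direct transcription of the displayed values, nothing added beyond the comment):

storage_pools:
  - name: foo
    type: partition
    disks:
      - sda
    volumes:
      - name: test1
        type: partition
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        encryption_password: yabbadabbadoo   # throwaway test secret, already shown in the clear above

The "Get required packages" step then resolves this spec to the single package "cryptsetup", consistent with the encrypted LUKS2 volume being the only part of the request that needs extra tooling.
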
20:21:03 -0400 (0:00:00.106) 0:12:18.712 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:21:03 -0400 (0:00:00.165) 0:12:18.877 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:21:04 -0400 (0:00:00.204) 0:12:19.082 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:21:08 -0400 (0:00:04.123) 0:12:23.205 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": 
"systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service": { "name": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service": { "name": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": 
"systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:21:11 -0400 (0:00:03.010) 0:12:26.215 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d814d1f35\x2dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not 
set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-814d1f35-ac59-4428-83ec-6f7b65c1154f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-814d1f35-ac59-4428-83ec-6f7b65c1154f /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-814d1f35-ac59-4428-83ec-6f7b65c1154f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:18:47 EDT", "StateChangeTimestampMonotonic": "2162140297", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", 
"AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service not 
found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:21:14 -0400 (0:00:02.941) 0:12:29.157 ********** fatal: [managed-node12]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:21:19 -0400 (0:00:05.429) 0:12:34.586 ********** fatal: [managed-node12]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:21:19 -0400 (0:00:00.186) 0:12:34.772 ********** changed: [managed-node12] => 
(item=systemd-cryptsetup@luks\x2d814d1f35\x2dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d814d1f35\\x2dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...dac59\x2d4428\x2d83ec\x2d6f7b65c1154f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "name": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dac59\\x2d4428\\x2d83ec\\x2d6f7b65c1154f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:21:23 -0400 (0:00:03.515) 0:12:38.288 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:21:23 -0400 (0:00:00.293) 0:12:38.582 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Friday 17 April 2026 20:21:23 -0400 (0:00:00.342) 0:12:38.924 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:21:24 -0400 (0:00:00.239) 0:12:39.164 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471643.688728, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471643.688728, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776471643.688728, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "903290266", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:21:25 -0400 (0:00:01.483) 0:12:40.647 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:317 Friday 17 April 2026 20:21:25 -0400 (0:00:00.235) 0:12:40.883 ********** ok: [managed-node12] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testtvp92ntnlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:324 Friday 17 April 2026 20:21:28 -0400 (0:00:02.954) 0:12:43.838 ********** ok: [managed-node12] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testtvp92ntnlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1776471689.140128-201778-85818435312135/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:331 Friday 17 April 2026 20:21:33 -0400 (0:00:04.235) 0:12:48.074 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:21:33 -0400 (0:00:00.247) 
0:12:48.321 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:21:33 -0400 (0:00:00.199) 0:12:48.520 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:21:33 -0400 (0:00:00.135) 0:12:48.656 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:21:35 -0400 (0:00:01.688) 0:12:50.345 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:21:35 -0400 (0:00:00.226) 0:12:50.571 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:21:37 -0400 (0:00:01.553) 0:12:52.125 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:21:37 -0400 (0:00:00.286) 0:12:52.412 ********** skipping: [managed-node12] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:21:37 -0400 (0:00:00.167) 0:12:52.579 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:21:37 -0400 (0:00:00.148) 0:12:52.728 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:21:37 -0400 (0:00:00.114) 0:12:52.842 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:21:37 -0400 (0:00:00.111) 0:12:52.954 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:21:38 -0400 (0:00:00.390) 0:12:53.344 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:21:38 -0400 (0:00:00.266) 0:12:53.611 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:21:38 -0400 (0:00:00.169) 0:12:53.781 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:21:42 -0400 (0:00:03.709) 0:12:57.490 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:21:42 -0400 (0:00:00.269) 0:12:57.759 ********** 
ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:21:42 -0400 (0:00:00.252) 0:12:58.011 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:21:48 -0400 (0:00:05.435) 0:13:03.447 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:21:48 -0400 (0:00:00.514) 0:13:03.961 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:21:49 -0400 (0:00:00.228) 0:13:04.190 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:21:49 -0400 (0:00:00.296) 0:13:04.486 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:21:49 -0400 (0:00:00.159) 0:13:04.645 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:21:54 -0400 (0:00:04.623) 0:13:09.269 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" 
}, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": 
"sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": 
"systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:21:57 -0400 (0:00:03.309) 0:13:12.579 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:21:58 -0400 (0:00:00.566) 0:13:13.145 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "password": "/tmp/storage_testtvp92ntnlukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:22:12 -0400 (0:00:13.990) 0:13:27.136 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is 
present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:22:12 -0400 (0:00:00.238) 0:13:27.374 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471582.0464756, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "aeafd9fa7aa5e66371ca43a3d9d5d8bcb9c29b32", "ctime": 1776471582.0424757, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471582.0424757, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:22:13 -0400 (0:00:01.416) 0:13:28.790 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:22:15 -0400 (0:00:01.643) 0:13:30.433 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:22:15 -0400 (0:00:00.318) 0:13:30.752 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "password": "/tmp/storage_testtvp92ntnlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:22:15 -0400 (0:00:00.224) 0:13:30.977 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test 
verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:22:16 -0400 (0:00:00.254) 0:13:31.231 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:22:16 -0400 (0:00:00.226) 0:13:31.458 ********** changed: [managed-node12] => (item={'src': 'UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=9c3b0166-74e1-4668-ac43-dd9ddb47d33a" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:22:18 -0400 (0:00:01.772) 0:13:33.230 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:22:20 -0400 (0:00:02.096) 0:13:35.327 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:22:22 -0400 (0:00:01.752) 0:13:37.079 ********** skipping: [managed-node12] => (item={'src': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:22:22 -0400 (0:00:00.308) 0:13:37.388 ********** ok: 
[managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:22:24 -0400 (0:00:01.750) 0:13:39.139 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471593.6215231, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776471587.0014958, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 23068918, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776471587.000496, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3607127744", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:22:25 -0400 (0:00:01.481) 0:13:40.620 ********** changed: [managed-node12] => (item={'backing_device': '/dev/sda1', 'name': 'luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', 'password': '/tmp/storage_testtvp92ntnlukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "password": "/tmp/storage_testtvp92ntnlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:22:27 -0400 (0:00:01.614) 0:13:42.235 ********** ok: [managed-node12] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:348 Friday 17 April 2026 20:22:28 -0400 (0:00:01.666) 0:13:43.901 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:22:29 -0400 (0:00:00.224) 0:13:44.125 ********** ok: [managed-node12] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:22:29 -0400 (0:00:00.300) 0:13:44.426 ********** skipping: [managed-node12] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:22:30 -0400 (0:00:00.795) 0:13:45.222 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "size": "4G", "type": "crypt", "uuid": "06a446ee-c806-4c14-80c6-3010c5f1f015" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "bbf9ee95-c86e-4f81-a243-062ccbb378c9" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": 
"disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:22:31 -0400 (0:00:01.262) 0:13:46.485 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002775", "end": "2026-04-17 20:22:32.600515", "rc": 0, "start": "2026-04-17 20:22:32.597740" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:22:32 -0400 (0:00:01.364) 0:13:47.849 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003436", "end": "2026-04-17 20:22:33.791993", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:22:33.788557" } STDOUT: luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9 /dev/sda1 /tmp/storage_testtvp92ntnlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:22:34 -0400 (0:00:01.205) 0:13:49.055 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node12 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:22:34 -0400 (0:00:00.541) 0:13:49.596 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:22:34 -0400 (0:00:00.129) 0:13:49.725 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:22:34 -0400 (0:00:00.255) 0:13:49.980 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:22:35 -0400 (0:00:00.291) 0:13:50.272 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:22:35 -0400 (0:00:00.584) 0:13:50.857 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:22:36 -0400 (0:00:00.250) 0:13:51.108 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:22:36 -0400 (0:00:00.208) 0:13:51.316 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:22:36 -0400 (0:00:00.233) 0:13:51.550 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:22:36 -0400 (0:00:00.220) 0:13:51.771 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:22:36 -0400 (0:00:00.213) 0:13:51.984 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 
Friday 17 April 2026 20:22:37 -0400 (0:00:00.311) 0:13:52.296 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:22:37 -0400 (0:00:00.234) 0:13:52.530 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:22:37 -0400 (0:00:00.169) 0:13:52.699 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:22:37 -0400 (0:00:00.150) 0:13:52.850 ********** ok: [managed-node12] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.96 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:22:39 -0400 (0:00:01.343) 0:13:54.193 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:22:39 -0400 (0:00:00.204) 0:13:54.398 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node12 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:22:39 -0400 (0:00:00.289) 0:13:54.688 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:22:39 -0400 (0:00:00.187) 0:13:54.875 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:22:40 -0400 (0:00:00.203) 0:13:55.078 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:22:40 -0400 (0:00:00.192) 0:13:55.271 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:22:40 -0400 (0:00:00.165) 0:13:55.437 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:22:40 -0400 (0:00:00.162) 0:13:55.600 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:22:40 -0400 (0:00:00.260) 0:13:55.861 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:22:41 -0400 (0:00:00.344) 0:13:56.206 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:22:41 -0400 (0:00:00.094) 0:13:56.300 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:22:41 -0400 (0:00:00.193) 0:13:56.494 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:22:41 -0400 (0:00:00.166) 0:13:56.660 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:22:41 -0400 (0:00:00.185) 0:13:56.846 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node12 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:22:42 -0400 (0:00:00.504) 0:13:57.351 ********** skipping: [managed-node12] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testtvp92ntnlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:22:42 -0400 (0:00:00.219) 0:13:57.570 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node12 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:22:42 -0400 (0:00:00.424) 0:13:57.994 ********** skipping: [managed-node12] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testtvp92ntnlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 
'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:22:43 -0400 (0:00:00.175) 0:13:58.170 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:22:43 -0400 (0:00:00.414) 0:13:58.585 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:22:43 -0400 (0:00:00.123) 0:13:58.708 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:22:43 -0400 (0:00:00.072) 0:13:58.780 ********** TASK [Clear test variables] 
**************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:22:43 -0400 (0:00:00.091) 0:13:58.872 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:22:43 -0400 (0:00:00.061) 0:13:58.933 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node12 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:22:44 -0400 (0:00:00.295) 0:13:59.229 ********** skipping: [managed-node12] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testtvp92ntnlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvp92ntnlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": 
"present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:22:44 -0400 (0:00:00.288) 0:13:59.517 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node12 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:22:44 -0400 (0:00:00.419) 0:13:59.937 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:22:45 -0400 (0:00:00.174) 0:14:00.112 ********** skipping: [managed-node12] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:22:45 -0400 (0:00:00.257) 0:14:00.369 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:22:45 -0400 (0:00:00.261) 0:14:00.631 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:22:45 -0400 (0:00:00.177) 0:14:00.808 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:22:45 -0400 (0:00:00.201) 0:14:01.010 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:22:46 -0400 (0:00:00.261) 0:14:01.271 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:22:46 -0400 (0:00:00.200) 0:14:01.472 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], 
"_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:22:46 -0400 (0:00:00.104) 0:14:01.576 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:22:46 -0400 (0:00:00.168) 0:14:01.745 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:22:46 -0400 (0:00:00.216) 0:14:01.961 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:22:48 -0400 (0:00:01.068) 0:14:03.030 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:22:48 -0400 (0:00:00.295) 0:14:03.325 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:22:48 -0400 (0:00:00.259) 0:14:03.585 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:22:48 -0400 (0:00:00.231) 0:14:03.817 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:22:48 -0400 (0:00:00.107) 0:14:03.924 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:22:49 -0400 (0:00:00.153) 0:14:04.078 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:22:49 -0400 (0:00:00.209) 0:14:04.287 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:22:49 -0400 (0:00:00.284) 0:14:04.572 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:22:49 -0400 (0:00:00.235) 0:14:04.807 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:22:50 -0400 (0:00:00.280) 0:14:05.087 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:22:50 -0400 (0:00:00.253) 0:14:05.341 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:22:50 -0400 (0:00:00.267) 0:14:05.608 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", 
"storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:22:51 -0400 (0:00:00.487) 0:14:06.095 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:22:51 -0400 (0:00:00.176) 0:14:06.272 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:22:51 -0400 (0:00:00.185) 0:14:06.458 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:22:51 -0400 (0:00:00.329) 0:14:06.788 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:22:51 -0400 (0:00:00.199) 0:14:06.988 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:22:52 -0400 (0:00:00.096) 0:14:07.084 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:22:52 -0400 (0:00:00.321) 0:14:07.405 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:22:52 -0400 (0:00:00.370) 0:14:07.775 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471731.6480887, "attr_flags": "", "attributes": [], "block_size": 4096, 
"blocks": 0, "charset": "binary", "ctime": 1776471731.6480887, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 266494, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776471731.6480887, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:22:54 -0400 (0:00:01.552) 0:14:09.328 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:22:54 -0400 (0:00:00.194) 0:14:09.522 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:22:54 -0400 (0:00:00.199) 0:14:09.722 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:22:54 -0400 (0:00:00.230) 0:14:09.952 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:22:55 -0400 (0:00:00.416) 0:14:10.369 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:22:56 -0400 (0:00:00.956) 0:14:11.325 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:22:56 -0400 (0:00:00.230) 0:14:11.556 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471731.8200893, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471731.8200893, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 266656, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, 
"issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471731.8200893, "nlink": 1, "path": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:22:58 -0400 (0:00:01.594) 0:14:13.151 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:23:02 -0400 (0:00:04.619) 0:14:17.770 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.011578", "end": "2026-04-17 20:23:03.837577", "rc": 0, "start": "2026-04-17 20:23:03.825999" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: bbf9ee95-c86e-4f81-a243-062ccbb378c9 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 902397 Threads: 2 Salt: 77 af 2e e2 bf 43 ed 3a 7c 29 6d 6f fe 74 f3 38 07 84 7b fc e6 40 ce 1b 5b af 9e a6 9a 98 8b 48 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 119156 Salt: 17 a6 e4 fd 50 c1 aa fd 63 cb ca 34 d1 7c ea 56 45 ea 51 1d fb 88 04 ce 14 aa fa 08 72 ac 62 e1 Digest: ba e1 df 47 33 43 19 51 6a 60 23 3d 11 38 77 71 32 f7 2b 41 8e 1c 47 2f 6d a8 05 cc cb 87 25 34 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:23:04 -0400 (0:00:01.393) 0:14:19.164 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:23:04 -0400 (0:00:00.327) 0:14:19.491 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:23:04 -0400 (0:00:00.260) 0:14:19.752 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:23:05 -0400 (0:00:00.304) 
0:14:20.057 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:23:05 -0400 (0:00:00.204) 0:14:20.262 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:23:05 -0400 (0:00:00.461) 0:14:20.723 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:23:05 -0400 (0:00:00.187) 0:14:20.911 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:23:06 -0400 (0:00:00.223) 0:14:21.134 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9 /dev/sda1 /tmp/storage_testtvp92ntnlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testtvp92ntnlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:23:06 -0400 (0:00:00.353) 0:14:21.488 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:23:06 -0400 (0:00:00.249) 0:14:21.738 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:23:06 -0400 (0:00:00.175) 0:14:21.914 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:23:07 -0400 (0:00:00.261) 0:14:22.175 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:23:07 -0400 (0:00:00.336) 0:14:22.512 ********** ok: [managed-node12] => { 
"ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:23:07 -0400 (0:00:00.332) 0:14:22.845 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:23:08 -0400 (0:00:00.251) 0:14:23.096 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:23:08 -0400 (0:00:00.273) 0:14:23.370 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:23:08 -0400 (0:00:00.348) 0:14:23.718 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:23:08 -0400 (0:00:00.279) 0:14:23.997 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:23:09 -0400 (0:00:00.235) 0:14:24.233 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:23:09 -0400 (0:00:00.208) 0:14:24.442 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:23:09 -0400 (0:00:00.306) 0:14:24.748 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:23:10 -0400 (0:00:00.434) 0:14:25.183 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID 
chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:23:10 -0400 (0:00:00.315) 0:14:25.498 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:23:10 -0400 (0:00:00.198) 0:14:25.697 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:23:11 -0400 (0:00:00.319) 0:14:26.016 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:23:11 -0400 (0:00:00.344) 0:14:26.360 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:23:11 -0400 (0:00:00.245) 0:14:26.606 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
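NOTE: The "VARIABLE IS NOT DEFINED!" marker here (and again in "Show expected size - 2" below) is expected output, not an error: the size-parsing tasks for this volume were all skipped because the volume is a plain partition, so the LVM size calculations do not apply and storage_test_expected_size is never set. When asked to show an undefined variable, debug prints the marker string instead of failing:

    # Prints "VARIABLE IS NOT DEFINED!..." rather than failing the play
    # when storage_test_expected_size was never set by the (skipped) tasks.
    - name: Show expected size
      debug:
        var: storage_test_expected_size

TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:23:11 -0400 (0:00:00.309) 0:14:26.915 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:23:12 -0400 (0:00:00.251) 0:14:27.167 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:23:12 -0400 (0:00:00.345) 0:14:27.512 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:23:12 -0400 (0:00:00.295) 0:14:27.808 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:23:13 -0400 (0:00:00.250) 0:14:28.059 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result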
was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:23:13 -0400 (0:00:00.232) 0:14:28.291 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:23:13 -0400 (0:00:00.203) 0:14:28.495 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:23:13 -0400 (0:00:00.168) 0:14:28.663 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:23:13 -0400 (0:00:00.261) 0:14:28.925 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:23:14 -0400 (0:00:00.160) 0:14:29.086 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:23:14 -0400 (0:00:00.205) 0:14:29.292 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:23:14 -0400 (0:00:00.343) 0:14:29.636 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:23:14 -0400 (0:00:00.185) 0:14:29.821 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:23:14 -0400 (0:00:00.192) 0:14:30.014 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:23:15 -0400 (0:00:00.253) 0:14:30.268 ********** skipping: 
[managed-node12] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:23:15 -0400 (0:00:00.234) 0:14:30.502 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:23:15 -0400 (0:00:00.198) 0:14:30.701 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:23:15 -0400 (0:00:00.250) 0:14:30.952 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:23:16 -0400 (0:00:00.205) 0:14:31.157 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:23:16 -0400 (0:00:00.214) 0:14:31.371 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:23:16 -0400 (0:00:00.335) 0:14:31.706 ********** ok: [managed-node12] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:23:16 -0400 (0:00:00.265) 0:14:31.972 ********** ok: [managed-node12] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:23:17 -0400 (0:00:00.268) 0:14:32.240 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:23:17 -0400 (0:00:00.114) 0:14:32.355 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] 
***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:23:17 -0400 (0:00:00.242) 0:14:32.598 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:23:17 -0400 (0:00:00.242) 0:14:32.841 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:23:18 -0400 (0:00:00.175) 0:14:33.016 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:23:18 -0400 (0:00:00.297) 0:14:33.314 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:23:18 -0400 (0:00:00.184) 0:14:33.499 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:23:18 -0400 (0:00:00.240) 0:14:33.739 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:23:18 -0400 (0:00:00.147) 0:14:33.886 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:23:19 -0400 (0:00:00.161) 0:14:34.048 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:23:19 -0400 (0:00:00.176) 0:14:34.224 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:351 Friday 17 April 2026 20:23:19 -0400 (0:00:00.079) 
0:14:34.303 ********** ok: [managed-node12] => { "changed": false, "path": "/tmp/storage_testtvp92ntnlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:361 Friday 17 April 2026 20:23:20 -0400 (0:00:01.259) 0:14:35.563 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:23:20 -0400 (0:00:00.203) 0:14:35.766 ********** ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:23:20 -0400 (0:00:00.233) 0:14:36.000 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:23:21 -0400 (0:00:00.289) 0:14:36.290 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:23:21 -0400 (0:00:00.129) 0:14:36.419 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:23:21 -0400 (0:00:00.220) 0:14:36.639 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:23:23 -0400 (0:00:01.648) 0:14:38.287 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:23:23 -0400 (0:00:00.170) 0:14:38.458 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:23:25 -0400 (0:00:01.869) 0:14:40.328 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", 
"skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:23:25 -0400 (0:00:00.415) 0:14:40.744 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:23:25 -0400 (0:00:00.259) 0:14:41.003 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:23:26 -0400 (0:00:00.211) 0:14:41.215 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:23:26 -0400 (0:00:00.222) 0:14:41.438 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:23:26 -0400 (0:00:00.195) 0:14:41.633 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:23:27 -0400 (0:00:00.439) 0:14:42.072 ********** skipping: [managed-node12] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:23:27 -0400 (0:00:00.167) 0:14:42.239 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:23:27 -0400 (0:00:00.246) 0:14:42.485 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:23:31 -0400 (0:00:04.176) 0:14:46.662 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:23:31 -0400 (0:00:00.194) 0:14:46.857 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:23:32 -0400 (0:00:00.249) 0:14:47.107 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:23:38 -0400 (0:00:05.926) 0:14:53.033 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:23:38 -0400 (0:00:00.360) 0:14:53.394 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:23:38 -0400 (0:00:00.152) 0:14:53.547 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:23:38 -0400 (0:00:00.199) 0:14:53.746 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:23:38 -0400 (0:00:00.189) 0:14:53.936 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:23:43 -0400 (0:00:04.363) 0:14:58.300 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:23:46 -0400 (0:00:03.189) 0:15:01.489 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:23:46 -0400 (0:00:00.304) 0:15:01.794 ********** fatal: [managed-node12]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:23:52 -0400 (0:00:05.523) 0:15:07.317 ********** fatal: [managed-node12]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:23:52 -0400 (0:00:00.152) 0:15:07.470 ********** TASK [Check that we failed in the role] **************************************** task path: 
TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:23:52 -0400 (0:00:00.097) 0:15:07.745 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:23:52 -0400 (0:00:00.148) 0:15:07.894 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:380 Friday 17 April 2026 20:23:53 -0400 (0:00:00.132) 0:15:08.026 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:23:53 -0400 (0:00:00.208) 0:15:08.234 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:23:53 -0400 (0:00:00.198) 0:15:08.433 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:23:53 -0400 (0:00:00.108) 0:15:08.541 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:23:55 -0400 (0:00:01.570) 0:15:10.112 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:23:55 -0400 (0:00:00.181) 0:15:10.294 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:23:57 -0400 (0:00:02.037) 0:15:12.331 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml",
"skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:23:57 -0400 (0:00:00.407) 0:15:12.739 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:23:57 -0400 (0:00:00.196) 0:15:12.935 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:23:58 -0400 (0:00:00.182) 0:15:13.118 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:23:58 -0400 (0:00:00.139) 0:15:13.258 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:23:58 -0400 (0:00:00.144) 0:15:13.402 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:23:58 -0400 (0:00:00.239) 0:15:13.642 ********** skipping: [managed-node12] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:23:58 -0400 (0:00:00.190) 0:15:13.832 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:23:59 -0400 (0:00:00.271) 0:15:14.104 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:24:03 -0400 (0:00:04.164) 0:15:18.268 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:24:03 -0400 (0:00:00.247) 0:15:18.516 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:24:03 -0400 (0:00:00.141) 0:15:18.657 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:24:08 -0400 (0:00:04.966) 0:15:23.624 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:24:08 -0400 (0:00:00.289) 0:15:23.913 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:24:09 -0400 (0:00:00.115) 0:15:24.029 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:24:09 -0400 (0:00:00.177) 0:15:24.207 ********** TASK 
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:24:09 -0400 (0:00:00.137) 0:15:24.344 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:24:13 -0400 (0:00:03.977) 0:15:28.321 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": 
"dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:24:16 -0400 (0:00:03.153) 0:15:31.475 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:24:16 -0400 (0:00:00.262) 0:15:31.737 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", 
"owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:24:31 -0400 (0:00:14.543) 0:15:46.281 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:24:31 -0400 (0:00:00.220) 0:15:46.502 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471741.8241303, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2d9645b963982d7a4ebd6b812f88e46bde983c01", "ctime": 1776471741.8211303, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471741.8211303, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] 
*** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:24:32 -0400 (0:00:01.348) 0:15:47.850 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:24:34 -0400 (0:00:01.265) 0:15:49.116 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:24:34 -0400 (0:00:00.263) 0:15:49.380 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:24:34 -0400 (0:00:00.290) 0:15:49.671 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:24:34 -0400 (0:00:00.223) 0:15:49.894 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:24:35 -0400 (0:00:00.223) 0:15:50.118 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:24:36 -0400 (0:00:01.478) 0:15:51.596 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:24:38 -0400 (0:00:01.876) 0:15:53.473 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:24:40 -0400 (0:00:01.711) 0:15:55.184 ********** skipping: [managed-node12] => (item={'src': '/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:24:40 -0400 (0:00:00.254) 0:15:55.439 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:24:42 -0400 (0:00:01.961) 0:15:57.400 ********** ok: 
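Note: the mount bookkeeping above is a remove-then-add swap on the same mount point: the /etc/fstab entry pointing at the old mapping (luks-bbf9ee95-...) is removed, the entry for the new mapping (luks-3d28318b-...) is written and mounted, and systemd is asked to reload so its generated mount units match the edited fstab. A minimal sketch of those operations, assuming the standard mount and systemd modules; parameter values are taken from the loop items above:

    - name: Remove the obsolete fstab entry (sketch)
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9
        fstype: xfs
        state: absent

    - name: Mount the new device and record it in fstab (sketch)
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1
        fstype: xfs
        opts: defaults
        state: mounted

    - name: Refresh systemd's view of /etc/fstab (sketch)
      systemd:
        daemon_reload: true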
[managed-node12] => { "changed": false, "stat": { "atime": 1776471753.7901793, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e34021dffe0539b65d6a0af10df657ad611c1e07", "ctime": 1776471746.9421513, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 180355269, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471746.9411514, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "720936498", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:24:43 -0400 (0:00:01.390) 0:15:58.791 ********** changed: [managed-node12] => (item={'backing_device': '/dev/sda1', 'name': 'luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node12] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-3d28318b-b762-4e84-b02d-ee21d873f8c1', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:24:46 -0400 (0:00:02.827) 0:16:01.619 ********** ok: [managed-node12] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:398 Friday 17 April 2026 20:24:48 -0400 (0:00:02.199) 0:16:03.818 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:24:49 -0400 (0:00:00.305) 0:16:04.123 ********** ok: [managed-node12] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", 
"_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:24:49 -0400 (0:00:00.280) 0:16:04.403 ********** skipping: [managed-node12] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:24:49 -0400 (0:00:00.322) 0:16:04.726 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "3d28318b-b762-4e84-b02d-ee21d873f8c1" }, "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "size": "4G", "type": "crypt", "uuid": "d2accc96-18cc-463f-9c79-7d2f4a930393" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": 
"", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:24:51 -0400 (0:00:01.634) 0:16:06.361 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002806", "end": "2026-04-17 20:24:52.636235", "rc": 0, "start": "2026-04-17 20:24:52.633429" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:24:52 -0400 (0:00:01.555) 0:16:07.916 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002584", "end": "2026-04-17 20:24:54.053492", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:24:54.050908" } STDOUT: luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:24:54 -0400 (0:00:01.469) 0:16:09.386 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node12 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:24:54 -0400 (0:00:00.341) 0:16:09.727 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] 
********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:24:54 -0400 (0:00:00.187) 0:16:09.915 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.032019", "end": "2026-04-17 20:24:56.307541", "rc": 0, "start": "2026-04-17 20:24:56.275522" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:24:56 -0400 (0:00:01.727) 0:16:11.643 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:24:56 -0400 (0:00:00.354) 0:16:11.997 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:24:57 -0400 (0:00:00.501) 0:16:12.499 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:24:57 -0400 (0:00:00.380) 0:16:12.880 ********** ok: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:25:01 -0400 (0:00:03.428) 0:16:16.308 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:25:01 -0400 (0:00:00.277) 0:16:16.586 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:25:01 -0400 (0:00:00.287) 0:16:16.873 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:25:02 -0400 (0:00:00.326) 
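Note: the shared-VG check above works because vgs with --binary prints report fields as 1/0, so -o shared yields "0" for a VG that is not marked shared, and the assertion that follows simply compares that string. A minimal sketch of the check, as a reconstruction of what the test appears to do:

    - name: Get VG shared value status (sketch)
      command: vgs --noheadings --binary -o shared foo
      register: vgs_shared
      changed_when: false

    - name: Verify that VG shared value checks out (sketch)
      assert:
        that:
          - "vgs_shared.stdout | trim == '0'"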
0:16:17.200 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:25:02 -0400 (0:00:00.217) 0:16:17.417 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:25:02 -0400 (0:00:00.218) 0:16:17.635 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:25:02 -0400 (0:00:00.253) 0:16:17.889 ********** ok: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:25:03 -0400 (0:00:00.388) 0:16:18.277 ********** ok: [managed-node12] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.96 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:25:04 -0400 (0:00:01.531) 0:16:19.808 ********** skipping: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:25:05 -0400 (0:00:00.273) 0:16:20.082 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node12 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:25:05 -0400 (0:00:00.553) 0:16:20.636 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:25:05 -0400 (0:00:00.312) 0:16:20.948 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 
20:25:06 -0400 (0:00:00.322) 0:16:21.270 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:25:06 -0400 (0:00:00.352) 0:16:21.623 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:25:06 -0400 (0:00:00.244) 0:16:21.868 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:25:07 -0400 (0:00:00.221) 0:16:22.089 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:25:07 -0400 (0:00:00.342) 0:16:22.432 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:25:07 -0400 (0:00:00.318) 0:16:22.750 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:25:08 -0400 (0:00:00.331) 0:16:23.081 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:25:08 -0400 (0:00:00.330) 0:16:23.412 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:25:08 -0400 (0:00:00.331) 0:16:23.743 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:25:08 -0400 (0:00:00.214) 0:16:23.958 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node12 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:25:09 -0400 (0:00:00.561) 0:16:24.520 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node12 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:25:09 -0400 (0:00:00.402) 0:16:24.923 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:25:10 -0400 (0:00:00.273) 0:16:25.197 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:25:10 -0400 (0:00:00.269) 0:16:25.466 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:25:10 -0400 (0:00:00.323) 0:16:25.790 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:25:11 -0400 (0:00:00.311) 0:16:26.102 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:25:11 -0400 (0:00:00.310) 0:16:26.413 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:25:11 -0400 (0:00:00.306) 0:16:26.720 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:25:11 -0400 (0:00:00.228) 0:16:26.949 ********** included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node12 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:25:12 -0400 (0:00:00.436) 0:16:27.385 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node12 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:25:12 -0400 (0:00:00.338) 0:16:27.724 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:25:12 -0400 (0:00:00.154) 0:16:27.878 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:25:13 -0400 (0:00:00.848) 0:16:28.726 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:25:14 -0400 (0:00:00.297) 0:16:29.024 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:25:14 -0400 (0:00:00.220) 0:16:29.244 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:25:14 -0400 (0:00:00.481) 0:16:29.727 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:25:14 -0400 (0:00:00.218) 0:16:29.945 ********** skipping: [managed-node12] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:25:15 -0400 (0:00:00.413) 0:16:30.358 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node12 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:25:15 -0400 (0:00:00.462) 0:16:30.820 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:25:16 -0400 (0:00:00.288) 0:16:31.109 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:25:16 -0400 (0:00:00.253) 0:16:31.362 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:25:16 -0400 (0:00:00.234) 0:16:31.597 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:25:16 -0400 (0:00:00.344) 0:16:31.941 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:25:17 -0400 (0:00:00.363) 0:16:32.305 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:25:17 -0400 (0:00:00.269) 0:16:32.575 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:25:17 -0400 (0:00:00.314) 0:16:32.889 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node12 TASK [Validate pool member VDO settings] *************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:25:18 -0400 (0:00:00.772) 0:16:33.662 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node12 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:25:19 -0400 (0:00:00.456) 0:16:34.118 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:25:19 -0400 (0:00:00.467) 0:16:34.586 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:25:19 -0400 (0:00:00.307) 0:16:34.893 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:25:20 -0400 (0:00:00.308) 0:16:35.201 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:25:20 -0400 (0:00:00.286) 0:16:35.487 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:25:20 -0400 (0:00:00.336) 0:16:35.824 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:25:21 -0400 (0:00:00.309) 0:16:36.134 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:25:21 -0400 (0:00:00.236) 0:16:36.370 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node12 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 
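The Stratis tasks that follow are skipped throughout this run because the pool under test is LVM-backed. When a Stratis pool is exercised, a verification of roughly this shape could apply; the snippet below is a minimal sketch rather than the test's actual task file, and it assumes the stratis-cli "stratis report" subcommand is available on the managed node:

# Illustrative only: the real test file conditions each task on the pool
# type, which is why every Stratis task is skipped in this LVM run.
- name: Get stratis pool information (sketch)
  ansible.builtin.command: stratis report
  register: storage_test_stratis_report
  changed_when: false  # read-only query, never reports a change

- name: Verify that the pool was created (sketch)
  ansible.builtin.assert:
    that:
      # keep the check on raw stdout so the sketch does not depend on
      # the exact JSON layout that "stratis report" emits
      - "'foo' in storage_test_stratis_report.stdout"

Keying the assertion on the raw stdout keeps the sketch independent of the report's JSON schema, at the cost of a looser match than parsing the pool list would give.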
Friday 17 April 2026 20:25:21 -0400 (0:00:00.574) 0:16:36.945 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:25:22 -0400 (0:00:00.271) 0:16:37.217 ********** skipping: [managed-node12] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:25:22 -0400 (0:00:00.245) 0:16:37.462 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:25:22 -0400 (0:00:00.285) 0:16:37.747 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:25:23 -0400 (0:00:00.296) 0:16:38.044 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:25:23 -0400 (0:00:00.187) 0:16:38.232 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:25:23 -0400 (0:00:00.236) 0:16:38.468 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:25:23 -0400 (0:00:00.165) 0:16:38.634 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:25:23 -0400 (0:00:00.135) 0:16:38.769 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:25:24 -0400 (0:00:00.494) 0:16:39.264 ********** ok: [managed-node12] => 
{ "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:25:24 -0400 (0:00:00.230) 0:16:39.494 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:25:25 -0400 (0:00:01.088) 0:16:40.583 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:25:25 -0400 (0:00:00.257) 0:16:40.841 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:25:26 -0400 (0:00:00.243) 0:16:41.085 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:25:26 -0400 (0:00:00.395) 0:16:41.480 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:25:26 -0400 (0:00:00.284) 0:16:41.765 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify 
mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:25:27 -0400 (0:00:00.266) 0:16:42.032 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:25:27 -0400 (0:00:00.357) 0:16:42.390 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:25:27 -0400 (0:00:00.237) 0:16:42.627 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:25:27 -0400 (0:00:00.221) 0:16:42.849 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:25:28 -0400 (0:00:00.364) 0:16:43.213 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:25:28 -0400 (0:00:00.247) 0:16:43.461 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:25:28 -0400 (0:00:00.173) 0:16:43.634 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:25:29 -0400 (0:00:00.572) 0:16:44.206 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] 
******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:25:29 -0400 (0:00:00.317) 0:16:44.524 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:25:29 -0400 (0:00:00.315) 0:16:44.839 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:25:30 -0400 (0:00:00.276) 0:16:45.116 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:25:30 -0400 (0:00:00.334) 0:16:45.451 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:25:30 -0400 (0:00:00.202) 0:16:45.654 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:25:30 -0400 (0:00:00.350) 0:16:46.004 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:25:31 -0400 (0:00:00.311) 0:16:46.316 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471870.7536523, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471870.7536523, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 281322, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471870.7536523, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:25:32 -0400 (0:00:01.452) 0:16:47.768 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:25:33 -0400 (0:00:00.289) 0:16:48.058 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:25:33 -0400 (0:00:00.317) 0:16:48.376 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:25:33 -0400 (0:00:00.317) 0:16:48.693 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:25:33 -0400 (0:00:00.301) 0:16:48.994 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:25:34 -0400 (0:00:00.271) 0:16:49.265 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:25:34 -0400 (0:00:00.374) 0:16:49.640 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471870.9176528, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471870.9176528, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 282565, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471870.9176528, "nlink": 1, "path": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:25:36 -0400 (0:00:01.838) 0:16:51.479 ********** ok: [managed-node12] => { "changed": false, "rc": 0, 
"results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:25:40 -0400 (0:00:04.352) 0:16:55.832 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010383", "end": "2026-04-17 20:25:42.077810", "rc": 0, "start": "2026-04-17 20:25:42.067427" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 3d28318b-b762-4e84-b02d-ee21d873f8c1 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 899422 Threads: 2 Salt: e3 b1 0c a4 ef 39 06 ee 21 e9 5a 1b 20 ad 67 61 62 4d 99 34 51 f1 a8 8d 5c 19 c5 f3 00 b4 12 57 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: d1 02 37 68 d8 2a e7 38 46 e2 b5 02 cd ed 2e 55 17 d0 81 5f 1a 15 7b ec c5 ad 76 dc 66 77 bc 34 Digest: 7d 77 2f 1d 20 8e 29 70 7a 0a 16 ca 21 e3 f0 89 08 9c df 03 82 39 87 cf 68 7e 07 9c 91 ab d4 8f TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:25:42 -0400 (0:00:01.522) 0:16:57.354 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:25:42 -0400 (0:00:00.237) 0:16:57.591 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:25:42 -0400 (0:00:00.191) 0:16:57.783 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:25:42 -0400 (0:00:00.215) 0:16:57.998 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:25:43 -0400 (0:00:00.165) 0:16:58.164 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:25:43 -0400 (0:00:00.292) 0:16:58.456 ********** ok: 
[managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:25:43 -0400 (0:00:00.332) 0:16:58.789 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:25:44 -0400 (0:00:00.374) 0:16:59.163 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:25:44 -0400 (0:00:00.288) 0:16:59.452 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:25:44 -0400 (0:00:00.307) 0:16:59.759 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:25:45 -0400 (0:00:00.406) 0:17:00.166 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:25:45 -0400 (0:00:00.313) 0:17:00.479 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:25:45 -0400 (0:00:00.261) 0:17:00.740 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:25:45 -0400 (0:00:00.210) 0:17:00.951 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:25:46 -0400 (0:00:00.298) 0:17:01.250 ********** skipping: 
[managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:25:46 -0400 (0:00:00.194) 0:17:01.444 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:25:46 -0400 (0:00:00.262) 0:17:01.707 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:25:46 -0400 (0:00:00.232) 0:17:01.940 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:25:47 -0400 (0:00:00.194) 0:17:02.134 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:25:47 -0400 (0:00:00.228) 0:17:02.363 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:25:47 -0400 (0:00:00.246) 0:17:02.609 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:25:47 -0400 (0:00:00.206) 0:17:02.816 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:25:48 -0400 (0:00:00.299) 0:17:03.115 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:25:48 -0400 (0:00:00.132) 0:17:03.247 ********** ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:25:52 -0400 (0:00:03.841) 0:17:07.089 ********** ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:25:53 -0400 (0:00:01.730) 0:17:08.820 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:25:54 -0400 (0:00:00.299) 0:17:09.119 ********** ok: [managed-node12] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:25:54 -0400 (0:00:00.284) 0:17:09.403 ********** ok: [managed-node12] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:25:55 -0400 (0:00:01.600) 0:17:11.003 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:25:56 -0400 (0:00:00.306) 0:17:11.310 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:25:56 -0400 (0:00:00.288) 0:17:11.599 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:25:56 -0400 (0:00:00.254) 0:17:11.853 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:25:57 -0400 (0:00:00.207) 0:17:12.061 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:25:57 -0400 (0:00:00.185) 0:17:12.247 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] 
*************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:25:57 -0400 (0:00:00.217) 0:17:12.465 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:25:57 -0400 (0:00:00.226) 0:17:12.691 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:25:57 -0400 (0:00:00.212) 0:17:12.904 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:25:58 -0400 (0:00:00.231) 0:17:13.135 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:25:58 -0400 (0:00:00.232) 0:17:13.367 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:25:58 -0400 (0:00:00.198) 0:17:13.566 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:25:58 -0400 (0:00:00.158) 0:17:13.724 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:25:58 -0400 (0:00:00.218) 0:17:13.942 ********** skipping: [managed-node12] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:25:59 -0400 (0:00:00.171) 0:17:14.113 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:25:59 -0400 (0:00:00.176) 0:17:14.290 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin 
pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:25:59 -0400 (0:00:00.188) 0:17:14.479 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:25:59 -0400 (0:00:00.284) 0:17:14.763 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:25:59 -0400 (0:00:00.208) 0:17:14.972 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:26:00 -0400 (0:00:00.239) 0:17:15.211 ********** ok: [managed-node12] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:26:00 -0400 (0:00:00.176) 0:17:15.388 ********** ok: [managed-node12] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:26:00 -0400 (0:00:00.241) 0:17:15.630 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:26:00 -0400 (0:00:00.176) 0:17:15.806 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024578", "end": "2026-04-17 20:26:01.883952", "rc": 0, "start": "2026-04-17 20:26:01.859374" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:26:02 -0400 (0:00:01.512) 0:17:17.319 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:26:02 
-0400 (0:00:00.282) 0:17:17.601 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:26:02 -0400 (0:00:00.294) 0:17:17.896 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:26:03 -0400 (0:00:00.265) 0:17:18.162 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:26:03 -0400 (0:00:00.286) 0:17:18.449 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:26:03 -0400 (0:00:00.279) 0:17:18.728 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:26:04 -0400 (0:00:00.293) 0:17:19.021 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:26:04 -0400 (0:00:00.267) 0:17:19.289 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:26:04 -0400 (0:00:00.206) 0:17:19.495 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:401 Friday 17 April 2026 20:26:04 -0400 (0:00:00.253) 0:17:19.749 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:26:05 -0400 (0:00:00.434) 0:17:20.183 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] 
*************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:26:05 -0400 (0:00:00.265) 0:17:20.449 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:26:05 -0400 (0:00:00.344) 0:17:20.793 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:26:08 -0400 (0:00:02.235) 0:17:23.028 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:26:08 -0400 (0:00:00.198) 0:17:23.226 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:26:10 -0400 (0:00:02.032) 0:17:25.259 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:26:10 -0400 (0:00:00.558) 0:17:25.818 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** 
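
Note that the blivet_package_list loaded from CentOS_8.yml above deliberately leaves one entry as an unresolved Jinja conditional, so the libblockdev flavor is only chosen when the list is templated against the target host's facts at package-install time. A one-task sketch of how that entry evaluates, with a hypothetical task name and the expression copied verbatim from the vars above; on anything other than s390x it yields "libblockdev":

    # Sketch: show which libblockdev flavor the package list resolves to.
    - name: Show the resolved libblockdev package (hypothetical task)
      debug:
        msg: "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
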
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:26:11 -0400 (0:00:00.377) 0:17:26.196 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:26:11 -0400 (0:00:00.350) 0:17:26.546 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:26:11 -0400 (0:00:00.222) 0:17:26.769 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:26:11 -0400 (0:00:00.163) 0:17:26.933 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:26:12 -0400 (0:00:00.610) 0:17:27.543 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:26:12 -0400 (0:00:00.210) 0:17:27.754 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:26:12 -0400 (0:00:00.150) 0:17:27.905 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:26:17 -0400 (0:00:04.745) 0:17:32.651 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:26:17 -0400 (0:00:00.247) 0:17:32.898 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:26:18 
-0400 (0:00:00.287) 0:17:33.186 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:26:23 -0400 (0:00:05.752) 0:17:38.938 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:26:24 -0400 (0:00:00.321) 0:17:39.259 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:26:24 -0400 (0:00:00.200) 0:17:39.460 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:26:24 -0400 (0:00:00.140) 0:17:39.600 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:26:24 -0400 (0:00:00.187) 0:17:39.787 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:26:29 -0400 (0:00:04.319) 0:17:44.106 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": 
"chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": 
"dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": 
"lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" 
}, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service": { "name": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service": { "name": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": 
"systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:26:32 -0400 (0:00:03.748) 0:17:47.855 ********** changed: 
[managed-node12] => (item=systemd-cryptsetup@luks\x2dbbf9ee95\x2dc86e\x2d4f81\x2da243\x2d062ccbb378c9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "name": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "tmp.mount dev-sda1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice -.mount systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9 /dev/sda1 /tmp/storage_testtvp92ntnlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-bbf9ee95-c86e-4f81-a243-062ccbb378c9 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_testtvp92ntnlukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:24:42 EDT", 
"StateChangeTimestampMonotonic": "2517054717", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...dc86e\x2d4f81\x2da243\x2d062ccbb378c9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "name": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "IgnoreOnIsolate": "no", 
"IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": 
"22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:26:36 -0400 (0:00:03.789) 0:17:51.645 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:26:42 -0400 (0:00:05.753) 0:17:57.398 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:26:42 -0400 (0:00:00.177) 0:17:57.575 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471879.9006891, "attr_flags": "", "attributes": [], 
"block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "893a46067303dab36a768ecc72ffb08ae7b19332", "ctime": 1776471879.8966892, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471879.8966892, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:26:43 -0400 (0:00:01.245) 0:17:58.821 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:26:43 -0400 (0:00:00.147) 0:17:58.969 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2dbbf9ee95\x2dc86e\x2d4f81\x2da243\x2d062ccbb378c9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "name": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", 
"EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dbbf9ee95\\x2dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", 
"SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...dc86e\x2d4f81\x2da243\x2d062ccbb378c9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "name": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dc86e\\x2d4f81\\x2da243\\x2d062ccbb378c9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:26:47 -0400 (0:00:03.693) 0:18:02.662 ********** ok: [managed-node12] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:26:47 -0400 (0:00:00.206) 0:18:02.868 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:26:48 -0400 (0:00:00.274) 0:18:03.142 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:26:48 -0400 (0:00:00.168) 0:18:03.311 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:26:48 -0400 (0:00:00.147) 0:18:03.459 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:26:49 -0400 (0:00:01.140) 0:18:04.599 ********** ok: [managed-node12] => (item={'src': '/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:26:50 -0400 (0:00:01.104) 0:18:05.703 ********** skipping: [managed-node12] => (item={'src': '/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:26:50 -0400 (0:00:00.160) 0:18:05.863 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:26:52 -0400 (0:00:01.283) 0:18:07.147 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471894.0517461, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2635c6ab821098838485c4f4f7598f91ccd7c580", "ctime": 1776471886.4387155, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 318767236, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471886.4377155, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1051951051", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:26:53 -0400 (0:00:01.369) 0:18:08.517 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:26:53 -0400 (0:00:00.116) 0:18:08.633 ********** ok: [managed-node12] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:414 Friday 17 April 2026 20:26:55 -0400 (0:00:01.949) 0:18:10.583 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:421 Friday 17 
April 2026 20:26:55 -0400 (0:00:00.272) 0:18:10.856 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:26:56 -0400 (0:00:00.451) 0:18:11.308 ********** ok: [managed-node12] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:26:56 -0400 (0:00:00.259) 0:18:11.567 ********** skipping: [managed-node12] => {} TASK [Collect info about the volumes.] 
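The verification harness included here (verify-role-results.yml) follows a fixed pattern: print the expected pool/volume facts, collect the actual block-device info, read /etc/fstab and /etc/crypttab, and then walk per-pool and per-volume check subsets. The crypttab read a little further down is deliberately failure-tolerant, since the file may legitimately not exist; a sketch of that pattern, with a hypothetical register name:

    # Sketch of the failure-tolerant read; the register name is hypothetical.
    - name: Read the /etc/crypttab file
      command: cat /etc/crypttab
      register: storage_test_crypttab
      failed_when: false
      changed_when: false

That is why the crypttab task's result below carries "failed_when_result": false rather than ever failing the play.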
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:26:56 -0400 (0:00:00.124) 0:18:11.691 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "3d28318b-b762-4e84-b02d-ee21d873f8c1" }, "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "size": "4G", "type": "crypt", "uuid": "d2accc96-18cc-463f-9c79-7d2f4a930393" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:26:57 -0400 (0:00:01.243) 0:18:12.935 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003042", "end": "2026-04-17 20:26:58.675620", "rc": 0, "start": "2026-04-17 20:26:58.672578" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
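# (annotation, not part of the file: the "system_role:storage" line at the top of this
# listing is the fingerprint the storage role stamps on files it manages; every entry
# below except the last predates this test run, and the final /dev/mapper/luks-* line
# for /opt/test1 is the mount this run manages and is now verifying.)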
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:26:58 -0400 (0:00:00.999) 0:18:13.934 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002995", "end": "2026-04-17 20:27:00.115872", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:27:00.112877" } STDOUT: luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:27:00 -0400 (0:00:01.462) 0:18:15.396 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node12 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:27:00 -0400 (0:00:00.288) 0:18:15.685 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:27:00 -0400 (0:00:00.124) 0:18:15.810 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.027125", "end": "2026-04-17 20:27:01.782127", "rc": 0, "start": "2026-04-17 20:27:01.755002" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:27:01 -0400 (0:00:01.132) 0:18:16.942 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:27:02 -0400 (0:00:00.137) 0:18:17.080 ********** included:
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:27:02 -0400 (0:00:00.439) 0:18:17.519 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:27:02 -0400 (0:00:00.196) 0:18:17.715 ********** ok: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:27:04 -0400 (0:00:01.368) 0:18:19.084 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:27:04 -0400 (0:00:00.201) 0:18:19.285 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:27:04 -0400 (0:00:00.128) 0:18:19.414 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:27:04 -0400 (0:00:00.222) 0:18:19.636 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:27:04 -0400 (0:00:00.122) 0:18:19.758 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:27:04 -0400 (0:00:00.187) 0:18:19.945 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
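The member checks just above all follow one pattern: derive the actual PV list for VG foo, store it in _storage_test_pool_pvs, and assert it against the expected values. Using only variable names that appear in this log, the PV-count assertion amounts to something like the following sketch (not the test file's exact text); note that _storage_test_expected_pv_count is set as the string "1" above, hence the cast:

    # Sketch of the assertion pattern used by these member checks.
    - name: Verify PV count
      assert:
        that:
          - _storage_test_pool_pvs | length == _storage_test_expected_pv_count | int
        msg: "Unexpected number of PVs in VG foo"

The earlier 'vgs --noheadings --binary -o shared foo' call fits the same style: --binary makes vgs print 0 or 1 instead of prose, so the shared-VG assertion can compare its output against a bare "0".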
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:27:05 -0400 (0:00:00.125) 0:18:20.070 ********** ok: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:27:05 -0400 (0:00:00.103) 0:18:20.174 ********** ok: [managed-node12] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.96 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:27:06 -0400 (0:00:01.329) 0:18:21.504 ********** skipping: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:27:06 -0400 (0:00:00.220) 0:18:21.724 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node12 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:27:07 -0400 (0:00:00.412) 0:18:22.137 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:27:07 -0400 (0:00:00.257) 0:18:22.395 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:27:07 -0400 (0:00:00.227) 0:18:22.622 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:27:07 -0400 (0:00:00.120) 0:18:22.742 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:27:07 -0400 (0:00:00.158) 0:18:22.901 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
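One result in this stretch deserves a gloss: "Check that blivet supports PV grow to fill" runs a probe on the managed node and treats a non-zero exit as "feature not available" rather than as an error, which is why the task shows rc 1 alongside "failed_when_result": false (the stray "Shared connection ... closed" line is just ssh teardown noise on stderr). A sketch of that probe pattern; the exact probe command here is hypothetical:

    # Sketch; the probed attribute and command are assumptions, not the test's exact probe.
    - name: Check that blivet supports PV grow to fill
      command: python3 -c "import blivet.formats.lvmpv as p; exit(0 if hasattr(p.LVMPhysicalVolume, 'grow_to_fill') else 1)"
      register: grow_supported
      failed_when: false
      changed_when: false

In this run the follow-up "Verify that PVs fill the whole devices when they should" items are skipped anyway, because the pool was configured with grow_to_fill: false.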
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:27:08 -0400 (0:00:00.176) 0:18:23.078 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:27:08 -0400 (0:00:00.166) 0:18:23.245 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:27:08 -0400 (0:00:00.202) 0:18:23.447 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:27:08 -0400 (0:00:00.142) 0:18:23.590 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:27:08 -0400 (0:00:00.206) 0:18:23.796 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:27:09 -0400 (0:00:00.319) 0:18:24.115 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:27:09 -0400 (0:00:00.095) 0:18:24.211 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node12 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:27:09 -0400 (0:00:00.426) 0:18:24.638 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node12 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:27:09 -0400 (0:00:00.278) 0:18:24.917 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:27:10 -0400 (0:00:00.105) 0:18:25.022 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:27:10 -0400 (0:00:00.274) 0:18:25.297 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:27:10 -0400 (0:00:00.239) 0:18:25.537 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:27:10 -0400 (0:00:00.176) 0:18:25.713 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:27:10 -0400 (0:00:00.184) 0:18:25.898 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:27:11 -0400 (0:00:00.227) 0:18:26.125 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:27:11 -0400 (0:00:00.174) 0:18:26.300 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node12 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:27:11 -0400 (0:00:00.426) 0:18:26.726 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node12 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:27:11 -0400 (0:00:00.241) 0:18:26.968 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
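A little further on, in the member-encryption checks, the expected crypttab entry count for each pool member is "0": encryption in this configuration sits on the LV (test1), not on the PV /dev/sda, so only the LV's luks-* mapping may appear in /etc/crypttab. The per-member check is essentially a filtered line count; a sketch using the variable names the log shows:

    # Sketch of the per-member crypttab check; variable names are the ones in this log.
    - name: Check for /etc/crypttab entry
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
        msg: "Unexpected crypttab entries for this pool member"

with _storage_test_crypttab_entries built beforehand by filtering the crypttab lines for the member device.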
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:27:12 -0400 (0:00:00.052) 0:18:27.020 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:27:12 -0400 (0:00:00.303) 0:18:27.324 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:27:12 -0400 (0:00:00.345) 0:18:27.669 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:27:12 -0400 (0:00:00.203) 0:18:27.873 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:27:13 -0400 (0:00:00.501) 0:18:28.375 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:27:13 -0400 (0:00:00.230) 0:18:28.606 ********** skipping: [managed-node12] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:27:13 -0400 (0:00:00.197) 0:18:28.803 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node12 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:27:13 -0400 (0:00:00.192) 0:18:28.995 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:27:14 -0400 (0:00:00.170) 0:18:29.166 ********** ok: [managed-node12] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:27:14 -0400 (0:00:00.219) 0:18:29.386 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:27:14 -0400 (0:00:00.240) 0:18:29.626 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:27:14 -0400 (0:00:00.222) 0:18:29.848 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:27:15 -0400 (0:00:00.178) 0:18:30.027 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:27:15 -0400 (0:00:00.229) 0:18:30.256 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:27:15 -0400 (0:00:00.168) 0:18:30.425 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node12 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:27:15 -0400 (0:00:00.372) 0:18:30.798 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node12 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:27:16 -0400 (0:00:00.347) 0:18:31.146 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:27:16 -0400 (0:00:00.113) 0:18:31.260 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:27:16 -0400 (0:00:00.169) 0:18:31.429 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:27:16 -0400 (0:00:00.157) 0:18:31.603 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is off] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:27:16 -0400 (0:00:00.175) 0:18:31.761 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is on] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:27:16 -0400 (0:00:00.175) 0:18:31.937 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:27:17 -0400 (0:00:00.139) 0:18:32.076 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:27:17 -0400 (0:00:00.097) 0:18:32.174 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node12 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:27:17 -0400 (0:00:00.350) 0:18:32.524 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:27:17 -0400 (0:00:00.150) 0:18:32.675 ********** skipping: [managed-node12] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:27:17 -0400 (0:00:00.203) 0:18:32.878 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026
20:27:18 -0400 (0:00:00.298) 0:18:33.177 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:27:18 -0400 (0:00:00.232) 0:18:33.410 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:27:18 -0400 (0:00:00.208) 0:18:33.619 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:27:18 -0400 (0:00:00.139) 0:18:33.758 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:27:19 -0400 (0:00:00.276) 0:18:34.034 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:27:19 -0400 (0:00:00.180) 0:18:34.215 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:27:19 -0400 (0:00:00.268) 0:18:34.484 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:27:19 -0400 (0:00:00.256) 0:18:34.740 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:27:20 -0400 (0:00:01.053) 0:18:35.794 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:27:21 -0400 (0:00:00.280) 0:18:36.074 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:27:21 -0400 (0:00:00.242) 0:18:36.317 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:27:21 -0400 (0:00:00.280) 0:18:36.597 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:27:21 -0400 (0:00:00.254) 0:18:36.852 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:27:22 -0400 (0:00:00.179) 0:18:37.031 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:27:22 -0400 (0:00:00.291) 0:18:37.323 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:27:22 -0400 (0:00:00.325) 0:18:37.649 ********** 
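For an encrypted LVM volume the mount checks run against the decrypted mapper device, which is why storage_test_device_path above is the /dev/mapper/luks-* node rather than the raw /dev/mapper/foo-test1. The mount-state check itself boils down to looking that device up in the gathered mount facts; a sketch using the fact and variable names visible in this log:

    # Sketch of the mount-state check; ansible_facts.mounts comes from fact gathering.
    - name: Verify the current mount state by device
      assert:
        that:
          - ansible_facts.mounts | selectattr('device', 'equalto', storage_test_device_path) | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point) | list | length == 1
        msg: "{{ storage_test_device_path }} is not mounted on {{ storage_test_mount_expected_mount_point }}"

Each entry in ansible_facts.mounts carries 'device' and 'mount' keys, so exactly one surviving match means the volume is mounted where /etc/fstab says it should be.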
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Friday 17 April 2026 20:27:22 -0400 (0:00:00.159) 0:18:37.808 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Friday 17 April 2026 20:27:22 -0400 (0:00:00.202) 0:18:38.011 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Friday 17 April 2026 20:27:23 -0400 (0:00:00.180) 0:18:38.191 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 17 April 2026 20:27:23 -0400 (0:00:00.306) 0:18:38.497 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 17 April 2026 20:27:24 -0400 (0:00:00.563) 0:18:39.061 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 17 April 2026 20:27:24 -0400 (0:00:00.204) 0:18:39.265 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 17 April 2026 20:27:24 -0400 (0:00:00.188) 0:18:39.454 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 17 April 2026 20:27:24 -0400 (0:00:00.239) 0:18:39.693 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed
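The fstab facts set above are lists of matching /etc/fstab fragments plus an expected match count as a string; each "Verify ..." assertion then just compares a list length against the expectation. A condensed sketch of that pattern, using the variable names shown in this log (the real tasks live in test-verify-volume-fstab.yml):

    - name: Verify that the device identifier appears in /etc/fstab (sketch)
      assert:
        that:
          - storage_test_fstab_id_matches | length | string == storage_test_fstab_expected_id_matches
        msg: "expected {{ storage_test_fstab_expected_id_matches }} fstab entry for {{ storage_test_device_path }}"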
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 17 April 2026 20:27:24 -0400 (0:00:00.270) 0:18:39.963 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 17 April 2026 20:27:25 -0400 (0:00:00.254) 0:18:40.217 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 17 April 2026 20:27:25 -0400 (0:00:00.458) 0:18:40.676 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 April 2026 20:27:25 -0400 (0:00:00.314) 0:18:40.991 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471942.0729399, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471870.7536523, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 281322, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471870.7536523, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 April 2026 20:27:27 -0400 (0:00:01.574) 0:18:42.565 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 April 2026 20:27:27 -0400 (0:00:00.302) 0:18:42.868 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 April 2026 20:27:28 -0400 (0:00:00.294) 0:18:43.162 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed
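The stat payload above is what the presence check consumes; the assertion only needs stat.exists and stat.isblk. An equivalent standalone check might look like this sketch (task names are illustrative):

    - name: See whether the device node is present
      stat:
        path: /dev/mapper/foo-test1
      register: storage_test_dev

    - name: Verify the presence/absence of the device node
      assert:
        that:
          - storage_test_dev.stat.exists
          - storage_test_dev.stat.isblk
        msg: "expected /dev/mapper/foo-test1 to be a block device node"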
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 April 2026 20:27:28 -0400 (0:00:00.278) 0:18:43.441 **********
ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 April 2026 20:27:28 -0400 (0:00:00.206) 0:18:43.647 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 April 2026 20:27:28 -0400 (0:00:00.241) 0:18:43.888 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 April 2026 20:27:29 -0400 (0:00:00.250) 0:18:44.139 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472001.9761813, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776471870.9176528, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 282565, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776471870.9176528, "nlink": 1, "path": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 April 2026 20:27:30 -0400 (0:00:01.387) 0:18:45.526 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 April 2026 20:27:34 -0400 (0:00:04.305) 0:18:49.832 **********
ok: [managed-node12] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.011533", "end": "2026-04-17 20:27:36.092909", "rc": 0, "start": "2026-04-17 20:27:36.081376" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           3d28318b-b762-4e84-b02d-ee21d873f8c1
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     899422
        Threads:    2
        Salt:       e3 b1 0c a4 ef 39 06 ee 21 e9 5a 1b 20 ad 67 61
                    62 4d 99 34 51 f1 a8 8d 5c 19 c5 f3 00 b4 12 57
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       d1 02 37 68 d8 2a e7 38 46 e2 b5 02 cd ed 2e 55
                    17 d0 81 5f 1a 15 7b ec c5 ad 76 dc 66 77 bc 34
        Digest:     7d 77 2f 1d 20 8e 29 70 7a 0a 16 ca 21 e3 f0 89
                    08 9c df 03 82 39 87 cf 68 7e 07 9c 91 ab d4 8f

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 April 2026 20:27:36 -0400 (0:00:01.535) 0:18:51.367 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 April 2026 20:27:37 -0400 (0:00:00.860) 0:18:52.227 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 April 2026 20:27:37 -0400 (0:00:00.223) 0:18:52.451 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 April 2026 20:27:37 -0400 (0:00:00.174) 0:18:52.626 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 April 2026 20:27:37 -0400 (0:00:00.177) 0:18:52.803 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Friday 17 April 2026 20:27:38 -0400 (0:00:00.216) 0:18:53.020 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Friday 17 April 2026 20:27:38 -0400 (0:00:00.265) 0:18:53.285 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Friday 17 April 2026 20:27:38 -0400 (0:00:00.220) 0:18:53.505 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
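The facts just set encode the crypttab expectation: exactly one /etc/crypttab line for the luks-<UUID> mapping, backed by /dev/mapper/foo-test1, with "-" (no key file) in the third field. The format, backing-device, and key-file checks that follow boil down to assertions like this sketch (field handling is illustrative):

    - name: Validate the format of the crypttab entry (sketch)
      assert:
        that:
          - _storage_test_crypttab_entries | length | string == _storage_test_expected_crypttab_entries
          - _storage_test_crypttab_entries[0].split()[1] == '/dev/mapper/foo-test1'
          - _storage_test_crypttab_entries[0].split()[2] == _storage_test_expected_crypttab_key_file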
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Friday 17 April 2026 20:27:38 -0400 (0:00:00.338) 0:18:53.844 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Friday 17 April 2026 20:27:39 -0400 (0:00:00.236) 0:18:54.080 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Friday 17 April 2026 20:27:39 -0400 (0:00:00.250) 0:18:54.331 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Friday 17 April 2026 20:27:39 -0400 (0:00:00.331) 0:18:54.662 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Friday 17 April 2026 20:27:39 -0400 (0:00:00.281) 0:18:54.944 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 17 April 2026 20:27:40 -0400 (0:00:00.132) 0:18:55.077 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 17 April 2026 20:27:40 -0400 (0:00:00.259) 0:18:55.336 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 17 April 2026 20:27:40 -0400 (0:00:00.232) 0:18:55.569 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 17 April 2026 20:27:40 -0400 (0:00:00.260) 0:18:55.830 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 17 April 2026 20:27:41 -0400 (0:00:00.214) 0:18:56.044 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 17 April 2026 20:27:41 -0400 (0:00:00.252) 0:18:56.296 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 17 April 2026 20:27:41 -0400 (0:00:00.219) 0:18:56.516 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 April 2026 20:27:41 -0400 (0:00:00.246) 0:18:56.763 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 April 2026 20:27:42 -0400 (0:00:00.300) 0:18:57.063 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 April 2026 20:27:42 -0400 (0:00:00.249) 0:18:57.313 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 April 2026 20:27:42 -0400 (0:00:00.173) 0:18:57.486 **********
ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 April 2026 20:27:44 -0400 (0:00:01.787) 0:18:59.274 **********
ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 17 April 2026 20:27:45 -0400 (0:00:01.629) 0:19:00.903 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }
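Both "Parse ..." steps above normalize a size to plain bytes so the later comparison is integer against integer: the requested "4g" and the LV's actual size both resolve to 4294967296. In stock Ansible the same normalization is available from the core human_to_bytes filter; a sketch under that assumption (the test itself uses its own size-parsing tasks):

    - name: Establish base value for expected size (sketch)
      set_fact:
        storage_test_expected_size: "{{ '4g' | human_to_bytes }}"  # resolves to 4294967296

    - name: Assert expected size is actual size
      assert:
        that:
          - storage_test_actual_size.bytes | int == storage_test_expected_size | int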
TASK [Show expected size] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 17 April 2026 20:27:46 -0400 (0:00:00.302) 0:19:01.205 **********
ok: [managed-node12] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 17 April 2026 20:27:46 -0400 (0:00:00.230) 0:19:01.436 **********
ok: [managed-node12] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 17 April 2026 20:27:48 -0400 (0:00:01.614) 0:19:03.051 **********
skipping: [managed-node12] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 17 April 2026 20:27:48 -0400 (0:00:00.290) 0:19:03.341 **********
skipping: [managed-node12] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 17 April 2026 20:27:48 -0400 (0:00:00.210) 0:19:03.551 **********
skipping: [managed-node12] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 17 April 2026 20:27:48 -0400 (0:00:00.261) 0:19:03.813 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Friday 17 April 2026 20:27:49 -0400 (0:00:00.232) 0:19:04.045 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Friday 17 April 2026 20:27:49 -0400 (0:00:00.327) 0:19:04.373 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Friday 17 April 2026 20:27:49 -0400 (0:00:00.251) 0:19:04.625 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Friday 17 April 2026 20:27:49 -0400 (0:00:00.224) 0:19:04.849 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Friday 17 April 2026 20:27:50 -0400 (0:00:00.177) 0:19:05.027 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Friday 17 April 2026 20:27:50 -0400 (0:00:00.249) 0:19:05.276 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Friday 17 April 2026 20:27:50 -0400 (0:00:00.201) 0:19:05.477 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Friday 17 April 2026 20:27:50 -0400 (0:00:00.129) 0:19:05.607 **********
skipping: [managed-node12] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Friday 17 April 2026 20:27:50 -0400 (0:00:00.161) 0:19:05.769 **********
skipping: [managed-node12] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Friday 17 April 2026 20:27:50 -0400 (0:00:00.157) 0:19:05.927 **********
skipping: [managed-node12] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Friday 17 April 2026 20:27:51 -0400 (0:00:00.210) 0:19:06.137 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Friday 17 April 2026 20:27:51 -0400 (0:00:00.168) 0:19:06.306 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Friday 17 April 2026 20:27:51 -0400 (0:00:00.250) 0:19:06.556 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Friday 17 April 2026 20:27:51 -0400 (0:00:00.252) 0:19:06.809 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Friday 17 April 2026 20:27:52 -0400 (0:00:00.277) 0:19:07.087 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Friday 17 April 2026 20:27:52 -0400 (0:00:00.298) 0:19:07.385 **********
ok: [managed-node12] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Friday 17 April 2026 20:27:52 -0400 (0:00:00.190) 0:19:07.576 **********
ok: [managed-node12] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Friday 17 April 2026 20:27:52 -0400 (0:00:00.186) 0:19:07.762 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 April 2026 20:27:53 -0400 (0:00:00.420) 0:19:08.182 **********
ok: [managed-node12] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030598", "end": "2026-04-17 20:27:54.610027", "rc": 0, "start": "2026-04-17 20:27:54.579429" }

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 April 2026 20:27:54 -0400 (0:00:01.748) 0:19:09.931 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 April 2026 20:27:55 -0400 (0:00:00.223) 0:19:10.154 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 April 2026 20:27:55 -0400 (0:00:00.357) 0:19:10.512 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 April 2026 20:27:55 -0400 (0:00:00.319) 0:19:10.831 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }
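The lvs flags above (--noheadings --nameprefixes --unquoted) produce the shell-style NAME=value line shown in STDOUT, which makes the segment type trivial to pull out with a regex. A sketch of the registration and parse, assuming the command result is registered as storage_test_lvs (the [ "linear" ] list form matches what regex_search returns when given a capture group):

    - name: Get information about the LV (sketch)
      command: >-
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: storage_test_lvs
      changed_when: false

    - name: Set LV segment type
      set_fact:
        storage_test_lv_segtype: "{{ storage_test_lvs.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"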
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 April 2026 20:27:56 -0400 (0:00:00.312) 0:19:11.143 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 April 2026 20:27:56 -0400 (0:00:00.203) 0:19:11.346 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 April 2026 20:27:56 -0400 (0:00:00.225) 0:19:11.571 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Friday 17 April 2026 20:27:56 -0400 (0:00:00.149) 0:19:11.721 **********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Friday 17 April 2026 20:27:56 -0400 (0:00:00.180) 0:19:11.901 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Friday 17 April 2026 20:27:57 -0400 (0:00:00.190) 0:19:12.092 **********
changed: [managed-node12] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode - 5] ******************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:427
Friday 17 April 2026 20:27:58 -0400 (0:00:01.362) 0:19:13.455 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 17 April 2026 20:27:58 -0400 (0:00:00.312) 0:19:13.767 **********
ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Friday 17 April 2026 20:27:59 -0400 (0:00:00.257) 0:19:14.024 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12
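This begins the fifth safe_mode case: storage_safe_mode is true, the upcoming pool spec turns encryption off on a volume that is currently LUKS-formatted, and the role is expected to refuse rather than destroy the data just written to /opt/test1/quux. The general shape of such a negative test is a block/rescue around the role invocation; a sketch (illustrative, not the verbatim verify-role-failed.yml, and the asserted message substring is an assumption):

    - name: Verify role raises correct error (sketch)
      block:
        - name: Run the role and expect it to fail
          include_role:
            name: fedora.linux_system_roles.storage
        - name: Fail if the role succeeded
          fail:
            msg: "the role was expected to fail in safe mode"
      rescue:
        - name: Check the failure came from the safe-mode guard
          assert:
            that:
              - ansible_failed_result.msg is search('safe mode')  # assumed message fragment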
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Friday 17 April 2026 20:27:59 -0400 (0:00:00.273) 0:19:14.298 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Friday 17 April 2026 20:27:59 -0400 (0:00:00.188) 0:19:14.486 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 April 2026 20:27:59 -0400 (0:00:00.245) 0:19:14.731 **********
ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Friday 17 April 2026 20:28:01 -0400 (0:00:01.685) 0:19:16.417 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 April 2026 20:28:01 -0400 (0:00:00.239) 0:19:16.656 **********
ok: [managed-node12]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 April 2026 20:28:03 -0400 (0:00:01.995) 0:19:18.652 **********
skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
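CentOS_8.yml is reported twice above most likely because the candidate vars-file list is built from several fact combinations, and on this host more than one candidate name resolves to the same file, while candidates that do not exist (RedHat.yml, CentOS.yml) are skipped by the loop condition. A sketch of that common pattern (illustrative, not the role's verbatim set_vars.yml):

    - name: Set platform/version specific variables (sketch)
      include_vars: "{{ __vars_file }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file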
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 April 2026 20:28:04 -0400 (0:00:00.466) 0:19:19.119 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 April 2026 20:28:04 -0400 (0:00:00.315) 0:19:19.434 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 April 2026 20:28:04 -0400 (0:00:00.163) 0:19:19.598 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 April 2026 20:28:04 -0400 (0:00:00.135) 0:19:19.733 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Friday 17 April 2026 20:28:04 -0400 (0:00:00.139) 0:19:19.873 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 April 2026 20:28:05 -0400 (0:00:00.347) 0:19:20.220 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Friday 17 April 2026 20:28:05 -0400 (0:00:00.097) 0:19:20.318 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Friday 17 April 2026 20:28:05 -0400 (0:00:00.089) 0:19:20.407 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Friday 17 April 2026 20:28:09 -0400 (0:00:04.050) 0:19:24.458 **********
ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }
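For reference, the storage_pools value printed above corresponds to a play invocation roughly like the following; with encryption: false requested for test1 (currently LUKS-encrypted) and storage_safe_mode still true, this is the input that should trip the safe-mode refusal (sketch assembled from the values shown in this log):

    - hosts: managed-node12
      vars:
        storage_safe_mode: true
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: false
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo
      roles:
        - fedora.linux_system_roles.storage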
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 17 April 2026 20:28:09 -0400 (0:00:00.244) 0:19:24.702 **********
ok: [managed-node12] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 April 2026 20:28:09 -0400 (0:00:00.247) 0:19:24.949 **********
ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Friday 17 April 2026 20:28:15 -0400 (0:00:05.742) 0:19:30.692 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 April 2026 20:28:15 -0400 (0:00:00.236) 0:19:30.929 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 April 2026 20:28:16 -0400 (0:00:00.153) 0:19:31.082 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 April 2026 20:28:16 -0400 (0:00:00.200) 0:19:31.283 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Friday 17 April 2026 20:28:16 -0400 (0:00:00.125) 0:19:31.409 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Friday 17 April 2026 20:28:20 -0400 (0:00:04.392) 0:19:35.802 **********
ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown",
"status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": 
"sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service": { "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service": { "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", 
"status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:28:24 -0400 (0:00:03.459) 0:19:39.262 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d3d28318b\x2db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; 
argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
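
The "Mask the systemd cryptsetup services" task running here neutralizes the crypttab-generated systemd-cryptsetup@*.service units before the role reconfigures the device: masking links each unit file to /dev/null so systemd cannot (re)activate the LUKS mapping mid-operation, and a matching unmask task restores the units afterwards (it appears further down, even on the failure path). A minimal sketch of the same pattern, with a hypothetical crypt_units list standing in for the names the role derives from service facts:

# Sketch only: mask crypttab-generated cryptsetup units around a storage
# change, then unmask them. "crypt_units" is a hypothetical variable.
- name: Mask the systemd cryptsetup services (sketch)
  ansible.builtin.systemd:
    name: "{{ item }}"
    masked: true
  loop: "{{ crypt_units }}"

- name: Unmask the systemd cryptsetup services (sketch)
  ansible.builtin.systemd:
    name: "{{ item }}"
    masked: false
  loop: "{{ crypt_units }}"
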
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:36 EDT", "StateChangeTimestampMonotonic": "2631425386", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": 
"0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:28:28 -0400 (0:00:04.028) 0:19:43.290 ********** fatal: [managed-node12]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-3d28318b-b762-4e84-b02d-ee21d873f8c1' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:28:34 -0400 (0:00:06.202) 0:19:49.493 ********** fatal: [managed-node12]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-3d28318b-b762-4e84-b02d-ee21d873f8c1' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 
'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:28:34 -0400 (0:00:00.313) 0:19:49.806 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d3d28318b\x2db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": 
"0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": 
"dev-mapper-luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:36 EDT", "StateChangeTimestampMonotonic": "2631425386", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": 
"no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:28:38 -0400 (0:00:03.734) 0:19:53.540 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:28:38 -0400 (0:00:00.192) 0:19:53.733 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:28:38 -0400 (0:00:00.262) 0:19:53.995 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:28:39 -0400 (0:00:00.212) 0:19:54.208 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472078.1974885, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776472078.1974885, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776472078.1974885, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1701907778", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 
20:28:40 -0400 (0:00:01.637) 0:19:55.845 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:451 Friday 17 April 2026 20:28:41 -0400 (0:00:00.261) 0:19:56.107 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:28:41 -0400 (0:00:00.344) 0:19:56.452 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:28:41 -0400 (0:00:00.256) 0:19:56.709 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:28:41 -0400 (0:00:00.278) 0:19:56.987 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:28:43 -0400 (0:00:01.494) 0:19:58.482 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:28:43 -0400 (0:00:00.126) 0:19:58.609 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:28:45 -0400 (0:00:01.995) 0:20:00.604 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { 
"ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:28:45 -0400 (0:00:00.408) 0:20:01.012 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:28:46 -0400 (0:00:00.352) 0:20:01.365 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:28:46 -0400 (0:00:00.288) 0:20:01.653 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:28:47 -0400 (0:00:00.965) 0:20:02.619 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:28:47 -0400 (0:00:00.190) 0:20:02.810 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:28:48 -0400 (0:00:00.559) 0:20:03.369 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:28:48 -0400 (0:00:00.246) 0:20:03.616 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:28:48 -0400 (0:00:00.079) 0:20:03.695 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : 
Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:28:53 -0400 (0:00:04.433) 0:20:08.128 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:28:53 -0400 (0:00:00.237) 0:20:08.366 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:28:53 -0400 (0:00:00.219) 0:20:08.586 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:28:59 -0400 (0:00:05.677) 0:20:14.263 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:28:59 -0400 (0:00:00.318) 0:20:14.581 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:28:59 -0400 (0:00:00.179) 0:20:14.761 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:28:59 -0400 (0:00:00.132) 0:20:14.893 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:28:59 -0400 (0:00:00.081) 0:20:14.975 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:29:03 -0400 (0:00:03.501) 0:20:18.476 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": 
"iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { 
"name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { 
"name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service": { "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "source": 
"systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service": { "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": 
"systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": 
"unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:29:06 -0400 (0:00:03.125) 0:20:21.602 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d3d28318b\x2db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", 
"EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": 
"no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:36 EDT", "StateChangeTimestampMonotonic": "2631425386", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read 
cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", 
"RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:29:10 -0400 (0:00:03.694) 0:20:25.296 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:29:16 -0400 (0:00:06.307) 0:20:31.604 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:29:16 -0400 (0:00:00.200) 0:20:31.804 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471879.9006891, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "893a46067303dab36a768ecc72ffb08ae7b19332", "ctime": 1776471879.8966892, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776471879.8966892, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:29:18 -0400 (0:00:01.657) 0:20:33.461 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:29:19 -0400 (0:00:01.380) 0:20:34.842 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d3d28318b\x2db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", 
"AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit 
systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:36 EDT", "StateChangeTimestampMonotonic": "2631425386", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", 
"CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", 
"LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:29:23 -0400 (0:00:03.812) 0:20:38.654 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:29:23 -0400 (0:00:00.190) 0:20:38.845 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": 
null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:29:24 -0400 (0:00:00.237) 0:20:39.082 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:29:24 -0400 (0:00:00.270) 0:20:39.353 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3d28318b-b762-4e84-b02d-ee21d873f8c1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:29:25 -0400 (0:00:01.557) 0:20:40.911 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:29:27 -0400 (0:00:01.882) 0:20:42.794 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:29:29 -0400 (0:00:01.593) 0:20:44.387 ********** skipping: [managed-node12] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, 
"passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:29:29 -0400 (0:00:00.283) 0:20:44.670 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:29:31 -0400 (0:00:01.870) 0:20:46.541 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776471894.0517461, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2635c6ab821098838485c4f4f7598f91ccd7c580", "ctime": 1776471886.4387155, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 318767236, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776471886.4377155, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1051951051", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:29:32 -0400 (0:00:01.338) 0:20:47.879 ********** changed: [managed-node12] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-3d28318b-b762-4e84-b02d-ee21d873f8c1', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:29:34 -0400 (0:00:01.664) 0:20:49.544 ********** ok: [managed-node12] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:467 Friday 17 April 2026 20:29:36 -0400 (0:00:01.978) 0:20:51.523 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:29:37 -0400 (0:00:00.493) 0:20:52.016 ********** ok: [managed-node12] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:29:37 -0400 (0:00:00.181) 0:20:52.198 ********** skipping: [managed-node12] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:29:37 -0400 (0:00:00.282) 0:20:52.481 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "c80d6eee-d3e5-46e9-9d90-646018c6b75e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:29:39 -0400 (0:00:01.718) 0:20:54.199 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003064", "end": "2026-04-17 20:29:40.513809", "rc": 0, "start": "2026-04-17 20:29:40.510745" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:29:40 -0400 (0:00:01.664) 0:20:55.863 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003012", "end": "2026-04-17 20:29:42.338606", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:29:42.335594" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:29:42 -0400 (0:00:01.784) 0:20:57.648 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node12 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:29:43 -0400 (0:00:00.495) 0:20:58.144 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:29:43 -0400 (0:00:00.191) 0:20:58.335 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.025530", "end": "2026-04-17 20:29:44.860036", "rc": 0, "start": "2026-04-17 20:29:44.834506" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:29:45 -0400 (0:00:01.733) 0:21:00.068 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:29:45 -0400 (0:00:00.385) 0:21:00.454 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node12 
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:29:45 -0400 (0:00:00.513) 0:21:00.968 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:29:46 -0400 (0:00:00.414) 0:21:01.382 ********** ok: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:29:47 -0400 (0:00:01.561) 0:21:02.943 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:29:48 -0400 (0:00:00.223) 0:21:03.166 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:29:48 -0400 (0:00:00.220) 0:21:03.387 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:29:48 -0400 (0:00:00.145) 0:21:03.532 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:29:48 -0400 (0:00:00.243) 0:21:03.776 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:29:49 -0400 (0:00:00.324) 0:21:04.101 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:29:49 -0400 (0:00:00.284) 0:21:04.385 ********** ok: [managed-node12] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:29:49 -0400 (0:00:00.299) 0:21:04.685 ********** ok: [managed-node12] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.96 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:29:51 -0400 (0:00:01.741) 0:21:06.426 ********** skipping: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:29:51 -0400 (0:00:00.296) 0:21:06.723 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node12 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:29:52 -0400 (0:00:00.360) 0:21:07.083 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:29:52 -0400 (0:00:00.232) 0:21:07.316 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:29:52 -0400 (0:00:00.355) 0:21:07.683 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:29:52 -0400 (0:00:00.286) 0:21:07.970 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:29:53 -0400 (0:00:00.317) 0:21:08.288 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:29:53 -0400 (0:00:00.230) 0:21:08.518 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:29:53 -0400 (0:00:00.172) 0:21:08.690 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:29:53 -0400 (0:00:00.152) 0:21:08.843 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:29:54 -0400 (0:00:00.221) 0:21:09.064 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:29:54 -0400 (0:00:00.140) 0:21:09.205 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:29:54 -0400 (0:00:00.237) 0:21:09.443 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:29:54 -0400 (0:00:00.210) 0:21:09.653 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node12 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:29:55 -0400 (0:00:00.456) 0:21:10.110 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node12 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:29:55 -0400 (0:00:00.383) 0:21:10.494 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:29:56 -0400 (0:00:01.288) 0:21:11.782 ********** skipping: [managed-node12] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:29:57 -0400 (0:00:00.304) 0:21:12.086 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:29:57 -0400 (0:00:00.321) 0:21:12.408 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:29:57 -0400 (0:00:00.326) 0:21:12.735 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:29:58 -0400 (0:00:00.283) 0:21:13.018 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:29:58 -0400 (0:00:00.308) 0:21:13.327 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:29:58 -0400 (0:00:00.210) 0:21:13.537 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node12 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:29:59 -0400 (0:00:00.515) 0:21:14.053 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node12 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:29:59 -0400 (0:00:00.331) 0:21:14.385 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:29:59 -0400 (0:00:00.191) 0:21:14.576 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in 
thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:29:59 -0400 (0:00:00.218) 0:21:14.795 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:30:00 -0400 (0:00:00.289) 0:21:15.084 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:30:00 -0400 (0:00:00.369) 0:21:15.454 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:30:01 -0400 (0:00:00.622) 0:21:16.076 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:30:01 -0400 (0:00:00.237) 0:21:16.314 ********** skipping: [managed-node12] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:30:01 -0400 (0:00:00.314) 0:21:16.628 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node12 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:30:02 -0400 (0:00:00.476) 0:21:17.105 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:30:02 -0400 (0:00:00.247) 0:21:17.352 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:30:02 -0400 (0:00:00.315) 0:21:17.667 ********** skipping: 
[managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:30:02 -0400 (0:00:00.266) 0:21:17.934 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:30:03 -0400 (0:00:00.269) 0:21:18.204 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:30:03 -0400 (0:00:00.282) 0:21:18.486 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:30:03 -0400 (0:00:00.253) 0:21:18.740 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:30:03 -0400 (0:00:00.192) 0:21:18.933 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node12 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:30:04 -0400 (0:00:00.583) 0:21:19.517 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node12 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:30:04 -0400 (0:00:00.427) 0:21:19.944 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:30:05 -0400 (0:00:00.386) 0:21:20.330 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:30:05 -0400 (0:00:00.244) 0:21:20.575 ********** skipping: [managed-node12] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:30:05 -0400 (0:00:00.424) 0:21:20.999 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:30:06 -0400 (0:00:00.311) 0:21:21.310 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:30:06 -0400 (0:00:00.399) 0:21:21.709 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:30:06 -0400 (0:00:00.284) 0:21:21.994 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:30:07 -0400 (0:00:00.311) 0:21:22.306 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node12 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:30:07 -0400 (0:00:00.703) 0:21:23.010 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:30:08 -0400 (0:00:00.348) 0:21:23.358 ********** skipping: [managed-node12] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:30:08 -0400 (0:00:00.317) 0:21:23.676 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026 20:30:08 -0400 (0:00:00.222) 0:21:23.898 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:30:09 -0400 (0:00:00.333) 0:21:24.232 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:30:09 -0400 (0:00:00.369) 0:21:24.601 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:30:09 -0400 (0:00:00.349) 0:21:24.951 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:30:10 -0400 (0:00:00.294) 0:21:25.246 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:30:10 -0400 (0:00:00.200) 0:21:25.447 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:30:10 -0400 (0:00:00.343) 0:21:25.790 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:30:11 -0400 (0:00:00.348) 0:21:26.139 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:30:12 -0400 (0:00:01.691) 0:21:27.830 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:30:13 -0400 (0:00:00.250) 0:21:28.081 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:30:13 -0400 (0:00:00.359) 0:21:28.441 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:30:13 -0400 (0:00:00.303) 0:21:28.744 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:30:13 -0400 (0:00:00.208) 0:21:28.953 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:30:14 -0400 (0:00:00.196) 0:21:29.149 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:30:14 -0400 (0:00:00.218) 0:21:29.367 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:30:14 -0400 (0:00:00.262) 0:21:29.629 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 
April 2026 20:30:14 -0400 (0:00:00.190) 0:21:29.820 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:30:15 -0400 (0:00:00.204) 0:21:30.025 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:30:15 -0400 (0:00:00.238) 0:21:30.264 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:30:15 -0400 (0:00:00.232) 0:21:30.497 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:30:16 -0400 (0:00:00.543) 0:21:31.041 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:30:16 -0400 (0:00:00.332) 0:21:31.373 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:30:16 -0400 (0:00:00.334) 0:21:31.708 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:30:16 -0400 (0:00:00.255) 0:21:31.963 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:30:17 -0400 (0:00:00.348) 0:21:32.311 ********** ok: [managed-node12] => { "ansible_facts": { 
"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:30:17 -0400 (0:00:00.223) 0:21:32.535 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:30:17 -0400 (0:00:00.434) 0:21:32.969 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:30:18 -0400 (0:00:00.194) 0:21:33.164 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472156.1388023, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472156.1388023, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 314888, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776472156.1388023, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:30:19 -0400 (0:00:01.468) 0:21:34.632 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:30:19 -0400 (0:00:00.187) 0:21:34.820 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:30:20 -0400 (0:00:00.230) 0:21:35.051 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:30:20 -0400 (0:00:00.132) 0:21:35.184 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK 
[Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:30:20 -0400 (0:00:00.218) 0:21:35.402 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:30:20 -0400 (0:00:00.184) 0:21:35.586 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:30:20 -0400 (0:00:00.212) 0:21:35.799 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:30:20 -0400 (0:00:00.207) 0:21:36.007 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:30:25 -0400 (0:00:04.039) 0:21:40.046 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:30:25 -0400 (0:00:00.149) 0:21:40.196 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:30:25 -0400 (0:00:00.260) 0:21:40.457 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:30:25 -0400 (0:00:00.217) 0:21:40.675 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:30:25 -0400 (0:00:00.181) 0:21:40.857 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:30:26 -0400 (0:00:00.207) 0:21:41.065 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:30:26 -0400 (0:00:00.260) 0:21:41.326 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:30:26 -0400 (0:00:00.213) 0:21:41.539 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:30:26 -0400 (0:00:00.240) 0:21:41.779 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:30:26 -0400 (0:00:00.236) 0:21:42.015 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:30:27 -0400 (0:00:00.166) 0:21:42.182 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:30:27 -0400 (0:00:00.228) 0:21:42.411 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:30:27 -0400 (0:00:00.143) 0:21:42.554 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:30:27 -0400 (0:00:00.177) 0:21:42.732 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about 
RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:30:27 -0400 (0:00:00.121) 0:21:42.853 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:30:28 -0400 (0:00:00.176) 0:21:43.030 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:30:28 -0400 (0:00:00.226) 0:21:43.256 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:30:28 -0400 (0:00:00.177) 0:21:43.433 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:30:28 -0400 (0:00:00.179) 0:21:43.613 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:30:28 -0400 (0:00:00.176) 0:21:43.789 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:30:28 -0400 (0:00:00.168) 0:21:43.958 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:30:29 -0400 (0:00:00.218) 0:21:44.176 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:30:29 -0400 (0:00:00.223) 0:21:44.400 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 
20:30:29 -0400 (0:00:00.206) 0:21:44.606 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:30:29 -0400 (0:00:00.298) 0:21:44.905 ********** ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:30:31 -0400 (0:00:01.479) 0:21:46.384 ********** ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:30:32 -0400 (0:00:01.418) 0:21:47.802 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:30:32 -0400 (0:00:00.206) 0:21:48.009 ********** ok: [managed-node12] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:30:33 -0400 (0:00:00.164) 0:21:48.173 ********** ok: [managed-node12] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:30:34 -0400 (0:00:01.087) 0:21:49.261 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:30:34 -0400 (0:00:00.348) 0:21:49.610 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:30:34 -0400 (0:00:00.211) 0:21:49.822 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:30:34 -0400 (0:00:00.182) 0:21:50.004 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:30:35 -0400 (0:00:00.186) 0:21:50.190 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:30:35 -0400 (0:00:00.088) 0:21:50.279 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:30:35 -0400 (0:00:00.101) 0:21:50.380 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:30:35 -0400 (0:00:00.192) 0:21:50.572 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:30:35 -0400 (0:00:00.151) 0:21:50.724 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:30:35 -0400 (0:00:00.129) 0:21:50.853 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:30:35 -0400 (0:00:00.159) 0:21:51.013 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:30:36 -0400 (0:00:00.115) 0:21:51.128 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:30:36 -0400 (0:00:00.185) 0:21:51.314 ********** skipping: [managed-node12] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:30:36 -0400 (0:00:00.151) 0:21:51.465 ********** skipping: [managed-node12] => {} TASK [Establish base value for expected thin pool size] ************************ task 
path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:30:36 -0400 (0:00:00.162) 0:21:51.628 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:30:36 -0400 (0:00:00.149) 0:21:51.778 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:30:36 -0400 (0:00:00.223) 0:21:52.001 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:30:37 -0400 (0:00:00.186) 0:21:52.188 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:30:37 -0400 (0:00:00.143) 0:21:52.332 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:30:37 -0400 (0:00:00.122) 0:21:52.454 ********** ok: [managed-node12] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:30:37 -0400 (0:00:00.070) 0:21:52.525 ********** ok: [managed-node12] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:30:37 -0400 (0:00:00.111) 0:21:52.637 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:30:37 -0400 (0:00:00.261) 0:21:52.898 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025019", "end": "2026-04-17 20:30:38.724951", "rc": 0, "start": "2026-04-17 20:30:38.699932" } STDOUT: 
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:30:38 -0400 (0:00:01.093) 0:21:53.991 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:30:39 -0400 (0:00:00.109) 0:21:54.101 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:30:39 -0400 (0:00:00.289) 0:21:54.390 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:30:39 -0400 (0:00:00.101) 0:21:54.492 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:30:39 -0400 (0:00:00.147) 0:21:54.640 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:30:40 -0400 (0:00:00.998) 0:21:55.639 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:30:40 -0400 (0:00:00.158) 0:21:55.797 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:30:40 -0400 (0:00:00.203) 0:21:56.000 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:30:41 -0400 (0:00:00.135) 0:21:56.135 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 April 2026 20:30:41 -0400 (0:00:00.112) 0:21:56.248 ********** changed: [managed-node12] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:473 Friday 17 April 2026 20:30:42 -0400 (0:00:00.922) 0:21:57.171 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node12 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 April 2026 20:30:42 -0400 (0:00:00.133) 0:21:57.304 ********** ok: [managed-node12] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 April 2026 20:30:42 -0400 (0:00:00.124) 0:21:57.428 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:30:42 -0400 (0:00:00.121) 0:21:57.550 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:30:42 -0400 (0:00:00.099) 0:21:57.650 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:30:42 -0400 (0:00:00.119) 0:21:57.769 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:30:44 -0400 (0:00:01.246) 0:21:59.016 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:30:44 -0400 (0:00:00.131) 0:21:59.147 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:30:45 -0400 (0:00:01.444) 0:22:00.592 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:30:45 -0400 (0:00:00.127) 0:22:00.719 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:30:45 -0400 (0:00:00.080) 0:22:00.800 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:30:45 -0400 (0:00:00.039) 0:22:00.839 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:30:45 -0400 (0:00:00.168) 0:22:01.007 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:30:46 -0400 (0:00:00.114) 0:22:01.121 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK 
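
The CentOS_8.yml vars file loaded above picks the s390x-specific libblockdev package with an inline Jinja2 conditional. An abridged sketch of how such a vars file expresses this (the full package list is the one shown in the task output):

    # roles/storage/vars/CentOS_8.yml (abridged sketch)
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-lvm
      - xfsprogs
      - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"

Note that the template string is stored unevaluated in the fact, as seen in the output above, and is rendered against each host's ansible_facts when the list is actually consumed.
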
[fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:30:46 -0400 (0:00:00.228) 0:22:01.349 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:30:46 -0400 (0:00:00.100) 0:22:01.450 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:30:46 -0400 (0:00:00.108) 0:22:01.558 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:30:50 -0400 (0:00:03.761) 0:22:05.320 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:30:50 -0400 (0:00:00.203) 0:22:05.524 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:30:50 -0400 (0:00:00.149) 0:22:05.673 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:30:55 -0400 (0:00:04.881) 0:22:10.555 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:30:55 -0400 (0:00:00.218) 0:22:10.773 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:30:55 -0400 (0:00:00.120) 0:22:10.894 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
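
The storage_pools structure echoed above corresponds to a role invocation along these lines (a sketch only; the way the test actually feeds variables through run_role_with_clear_facts.yml is not reproduced here):

    - hosts: all
      vars:
        storage_safe_mode: true
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo
      roles:
        - fedora.linux_system_roles.storage

With safe mode on (storage_safe_mode_global is true above), the role is expected to refuse to encrypt the existing volume in place, which is what the enclosing "Test for correct handling of safe_mode - 6" and "Verify role raises correct error - 2" tasks check.
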
************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:30:55 -0400 (0:00:00.079) 0:22:10.973 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:30:56 -0400 (0:00:00.070) 0:22:11.043 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:30:59 -0400 (0:00:03.710) 0:22:14.753 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service": { "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service": { "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:31:01 -0400 (0:00:02.218) 0:22:16.972 ********** changed: [managed-node12] => (item=systemd-cryptsetup@luks\x2d3d28318b\x2db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", 
"CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-3d28318b-b762-4e84-b02d-ee21d873f8c1", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-3d28318b-b762-4e84-b02d-ee21d873f8c1 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-04-17 20:26:36 EDT", "StateChangeTimestampMonotonic": "2631425386", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": 
"[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", 
"LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:31:04 -0400 (0:00:02.663) 0:22:19.636 ********** fatal: [managed-node12]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Friday 17 April 2026 20:31:09 -0400 (0:00:05.286) 0:22:24.923 ********** fatal: [managed-node12]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:31:10 -0400 (0:00:00.191) 0:22:25.114 ********** changed: [managed-node12] => 
(item=systemd-cryptsetup@luks\x2d3d28318b\x2db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d3d28318b\\x2db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node12] => (item=systemd-cryptsetup@luk...db762\x2d4e84\x2db02d\x2dee21d873f8c1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "name": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db762\\x2d4e84\\x2db02d\\x2dee21d873f8c1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:31:12 -0400 (0:00:02.297) 0:22:27.412 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:31:12 -0400 (0:00:00.252) 0:22:27.664 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Friday 17 April 2026 20:31:12 -0400 (0:00:02.297) 0:22:27.412 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Friday 17 April 2026 20:31:12 -0400 (0:00:00.252) 0:22:27.664 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Friday 17 April 2026 20:31:12 -0400 (0:00:00.121) 0:22:27.888 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 April 2026 20:31:12 -0400 (0:00:00.121) 0:22:28.010 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472241.933148, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776472241.933148, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776472241.933148, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "609436772", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 April 2026 20:31:14 -0400 (0:00:01.080) 0:22:29.090 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed
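That closes out the expected-failure pass: with 'safe_mode': True in the module args above, blivet refused to replace the existing xfs formatting on test1 with a LUKS2 container, the assertions verified the exact error message, and the stat of /opt/test1/quux confirmed the data survived. For contrast, here is a minimal sketch of a play that deliberately opts out of this protection through the role's documented storage_safe_mode variable, reusing the pool/volume values visible in this log (an illustration, not the test's own playbook):

    - hosts: managed-node12
      vars:
        # Default is true; setting false permits removing existing formatting,
        # which is what layering LUKS on top of an existing xfs volume requires.
        storage_safe_mode: false
        storage_pools:
          - name: foo
            disks: [sda]
            type: lvm
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo  # throwaway test value shown in this log
      roles:
        - fedora.linux_system_roles.storage

The "Add encryption to the volume - 3" pass that follows drives the role with this same pool spec (see the Show storage_pools output below).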
TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:497 Friday 17 April 2026 20:31:14 -0400 (0:00:00.062) 0:22:29.153 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Friday 17 April 2026 20:31:14 -0400 (0:00:00.142) 0:22:29.296 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Friday 17 April 2026 20:31:14 -0400 (0:00:00.051) 0:22:29.347 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 April 2026 20:31:14 -0400 (0:00:00.123) 0:22:29.471 ********** ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Friday 17 April 2026 20:31:15 -0400 (0:00:01.178) 0:22:30.649 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 April 2026 20:31:15 -0400 (0:00:00.124) 0:22:30.774 ********** ok: [managed-node12] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 April 2026 20:31:17 -0400 (0:00:01.512) 0:22:32.286 ********** skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 April 2026 20:31:17 -0400 (0:00:00.321) 0:22:32.608 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 April 2026 20:31:17 -0400 (0:00:00.105) 0:22:32.714 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 April 2026 20:31:17 -0400 (0:00:00.116) 0:22:32.831 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 April 2026 20:31:17 -0400 (0:00:00.099) 0:22:32.930 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Friday 17 April 2026 20:31:17 -0400 (0:00:00.068) 0:22:32.998 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 April 2026 20:31:18 -0400 (0:00:00.218) 0:22:33.217 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Friday 17 April 2026 20:31:18 -0400 (0:00:00.127) 0:22:33.345 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Friday 17 April 2026 20:31:18 -0400 (0:00:00.147) 0:22:33.492 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Friday 17 April 2026 20:31:21 -0400 (0:00:03.351) 0:22:36.844 ********** ok: [managed-node12] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 17 April 2026 20:31:22 -0400 (0:00:00.182) 0:22:37.027 ********** ok: [managed-node12] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 April 2026 20:31:22 -0400 (0:00:00.123) 0:22:37.150 ********** ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Friday 17 April 2026 20:31:27 -0400 (0:00:04.996) 0:22:42.147 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 April 2026 20:31:27 -0400 (0:00:00.146) 0:22:42.294 ********** TASK [fedora.linux_system_roles.storage : 
Make sure COPR support packages are present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 April 2026 20:31:27 -0400 (0:00:00.120) 0:22:42.414 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 April 2026 20:31:27 -0400 (0:00:00.068) 0:22:42.483 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Friday 17 April 2026 20:31:27 -0400 (0:00:00.039) 0:22:42.522 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Friday 17 April 2026 20:31:30 -0400 (0:00:02.877) 0:22:45.400 ********** ok: [managed-node12] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": 
"mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": 
"systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:31:32 -0400 (0:00:02.349) 0:22:47.750 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:31:32 -0400 (0:00:00.135) 0:22:47.886 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3731fa37-6840-411c-b129-58aaccae3623", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "state": 
"mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:31:46 -0400 (0:00:13.525) 0:23:01.411 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:31:46 -0400 (0:00:00.091) 0:23:01.503 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472169.0538545, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1776472169.0488544, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776472169.0488544, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 
17 April 2026 20:31:47 -0400 (0:00:01.183) 0:23:02.686 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:31:48 -0400 (0:00:01.137) 0:23:03.824 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:31:49 -0400 (0:00:00.239) 0:23:04.063 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3731fa37-6840-411c-b129-58aaccae3623", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, 
"thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:31:49 -0400 (0:00:00.178) 0:23:04.242 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:31:49 -0400 (0:00:00.180) 0:23:04.423 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:31:49 -0400 (0:00:00.158) 0:23:04.582 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:31:50 -0400 (0:00:01.100) 0:23:05.682 
********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:31:52 -0400 (0:00:01.498) 0:23:07.181 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:31:53 -0400 (0:00:01.633) 0:23:08.814 ********** skipping: [managed-node12] => (item={'src': '/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:31:54 -0400 (0:00:00.277) 0:23:09.092 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:31:55 -0400 (0:00:01.683) 0:23:10.776 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472182.3379078, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776472174.164875, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 117440724, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776472174.1638749, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "165917832", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Friday 17 April 2026 20:31:57 -0400 (0:00:01.514) 0:23:12.290 ********** changed: [managed-node12] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-3731fa37-6840-411c-b129-58aaccae3623', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3731fa37-6840-411c-b129-58aaccae3623", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:31:58 -0400 (0:00:01.181) 0:23:13.472 ********** ok: [managed-node12] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:513 Friday 17 April 2026 20:32:00 -0400 (0:00:01.846) 0:23:15.318 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:32:00 -0400 (0:00:00.400) 0:23:15.719 ********** ok: [managed-node12] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:32:00 
-0400 (0:00:00.136) 0:23:15.856 ********** skipping: [managed-node12] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:32:01 -0400 (0:00:00.174) 0:23:16.031 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "3731fa37-6840-411c-b129-58aaccae3623" }, "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "size": "4G", "type": "crypt", "uuid": "772271ff-1147-4973-b669-ace338811e63" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:32:02 -0400 (0:00:01.175) 0:23:17.206 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002703", "end": "2026-04-17 20:32:03.308684", "rc": 0, "start": "2026-04-17 20:32:03.305981" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:32:03 -0400 (0:00:01.423) 0:23:18.630 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002790", "end": "2026-04-17 20:32:04.651483", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:32:04.648693" } STDOUT: luks-3731fa37-6840-411c-b129-58aaccae3623 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:32:04 -0400 (0:00:01.277) 0:23:19.908 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node12 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 April 2026 20:32:05 -0400 (0:00:00.290) 0:23:20.198 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 April 2026 20:32:05 -0400 (0:00:00.169) 0:23:20.368 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023533", "end": "2026-04-17 20:32:06.808380", "rc": 0, "start": "2026-04-17 20:32:06.784847" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 April 2026 20:32:07 -0400 (0:00:01.780) 0:23:22.149 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 April 2026 20:32:07 -0400 (0:00:00.264) 0:23:22.414 ********** included:
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 April 2026 20:32:07 -0400 (0:00:00.383) 0:23:22.797 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 April 2026 20:32:08 -0400 (0:00:00.471) 0:23:23.268 ********** ok: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 April 2026 20:32:09 -0400 (0:00:01.545) 0:23:24.813 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 April 2026 20:32:10 -0400 (0:00:00.280) 0:23:25.094 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 April 2026 20:32:10 -0400 (0:00:00.315) 0:23:25.409 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 April 2026 20:32:10 -0400 (0:00:00.347) 0:23:25.756 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 April 2026 20:32:11 -0400 (0:00:00.272) 0:23:26.029 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 April 2026 20:32:11 -0400 (0:00:00.330) 0:23:26.359 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 17 April 2026 20:32:11 -0400 (0:00:00.481) 0:23:26.840 ********** ok: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 17 April 2026 20:32:12 -0400 (0:00:00.360) 0:23:27.201 ********** ok: [managed-node12] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.96 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 17 April 2026 20:32:13 -0400 (0:00:01.490) 0:23:28.692 ********** skipping: [managed-node12] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 17 April 2026 20:32:13 -0400 (0:00:00.263) 0:23:28.955 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node12 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 April 2026 20:32:14 -0400 (0:00:00.462) 0:23:29.418 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 April 2026 20:32:14 -0400 (0:00:00.290) 0:23:29.708 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 April 2026 20:32:14 -0400 (0:00:00.166) 0:23:29.874 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 April 2026 20:32:15 -0400 (0:00:00.169) 0:23:30.044 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 April 2026 20:32:15 -0400 (0:00:00.177) 0:23:30.222 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 April 2026 20:32:15 -0400 (0:00:00.221) 0:23:30.443 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 April 2026 20:32:15 -0400 (0:00:00.266) 0:23:30.710 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 April 2026 20:32:15 -0400 (0:00:00.151) 0:23:30.862 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 April 2026 20:32:16 -0400 (0:00:00.324) 0:23:31.186 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 April 2026 20:32:16 -0400 (0:00:00.249) 0:23:31.436 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 April 2026 20:32:16 -0400 (0:00:00.240) 0:23:31.676 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 17 April 2026 20:32:16 -0400 (0:00:00.272) 0:23:31.948 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node12 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 April 2026 20:32:17 -0400 (0:00:00.396) 0:23:32.344 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node12 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 April 2026 20:32:17 -0400 (0:00:00.328) 0:23:32.673 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 April 2026 20:32:17 -0400 (0:00:00.243) 0:23:32.916 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 April 2026 20:32:18 -0400 (0:00:00.254) 0:23:33.171 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 April 2026 20:32:18 -0400 (0:00:00.362) 0:23:33.533 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 April 2026 20:32:18 -0400 (0:00:00.317) 0:23:33.851 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 April 2026 20:32:19 -0400 (0:00:00.301) 0:23:34.152 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 April 2026 20:32:19 -0400 (0:00:00.106) 0:23:34.259 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 17 April 2026 20:32:19 -0400 (0:00:00.157) 0:23:34.417 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node12 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 April 2026 20:32:19 -0400 (0:00:00.356) 0:23:34.779 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node12 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 April 2026 20:32:20 -0400 (0:00:00.356) 0:23:35.135 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 April 2026 20:32:20 -0400 (0:00:00.184) 0:23:35.320 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 April 2026 20:32:20 -0400 (0:00:00.163) 0:23:35.484 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 April 2026 20:32:20 -0400 (0:00:00.173) 0:23:35.657 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 17 April 2026 20:32:20 -0400 (0:00:00.172) 0:23:35.830 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node12 TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 April 2026 20:32:21 -0400 (0:00:00.359) 0:23:36.190 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 April 2026 20:32:21 -0400 (0:00:00.263) 0:23:36.453 ********** skipping: [managed-node12] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 April 2026 20:32:21 -0400 (0:00:00.232) 0:23:36.686 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node12 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 April 2026 20:32:21 -0400 (0:00:00.312) 0:23:36.998 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 April 2026 20:32:22 -0400 (0:00:00.217) 0:23:37.216 ********** ok: [managed-node12] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 April 2026 20:32:22 -0400 (0:00:00.236) 0:23:37.453 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 April 2026 20:32:22 -0400 (0:00:00.133) 0:23:37.586 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 April 2026 20:32:22 -0400 (0:00:00.220) 0:23:37.807 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 April 2026 20:32:23 -0400 (0:00:00.253) 0:23:38.061 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 April 2026 20:32:23 -0400 (0:00:00.216) 0:23:38.277 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 17 April 2026 20:32:23 -0400 (0:00:00.107) 0:23:38.384 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node12 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 April 2026 20:32:23 -0400 (0:00:00.419) 0:23:38.803 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node12 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 April 2026 20:32:24 -0400 (0:00:00.435) 0:23:39.238 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 April 2026 20:32:24 -0400 (0:00:00.298) 0:23:39.537 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 April 2026 20:32:24 -0400 (0:00:00.268) 0:23:39.806 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 April 2026 20:32:25 -0400 (0:00:00.337) 0:23:40.143 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 April 2026 20:32:25 -0400 (0:00:00.266) 0:23:40.409 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 April 2026 20:32:25 -0400 (0:00:00.344) 0:23:40.754 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 April 2026 20:32:26 -0400 (0:00:00.326) 0:23:41.081 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 17 April 2026 20:32:26 -0400 (0:00:00.293) 0:23:41.374 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node12 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 April 2026 20:32:26 -0400 (0:00:00.549) 0:23:41.924 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 April 2026 20:32:27 -0400 (0:00:00.164) 0:23:42.089 ********** skipping: [managed-node12] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 17 April 2026 20:32:27 -0400 (0:00:00.177) 0:23:42.267 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 17 April 2026
20:32:27 -0400 (0:00:00.233) 0:23:42.500 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 17 April 2026 20:32:27 -0400 (0:00:00.142) 0:23:42.643 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 17 April 2026 20:32:27 -0400 (0:00:00.166) 0:23:42.809 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 17 April 2026 20:32:28 -0400 (0:00:00.229) 0:23:43.039 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 17 April 2026 20:32:28 -0400 (0:00:00.258) 0:23:43.297 ********** ok: [managed-node12] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 April 2026 20:32:28 -0400 (0:00:00.264) 0:23:43.561 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:32:28 -0400 (0:00:00.435) 0:23:43.996 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:32:30 -0400 (0:00:01.166) 0:23:45.163 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:32:31 -0400 (0:00:01.245) 0:23:46.409 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:32:31 -0400 (0:00:00.189) 0:23:46.598 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:32:31 -0400 (0:00:00.234) 0:23:46.832 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:32:32 -0400 (0:00:00.353) 0:23:47.186 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:32:32 -0400 (0:00:00.207) 0:23:47.393 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:32:32 -0400 (0:00:00.151) 0:23:47.545 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:32:32 -0400 (0:00:00.156) 0:23:47.701 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:32:32 -0400 (0:00:00.170) 0:23:47.871 ********** 
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:32:33 -0400 (0:00:00.185) 0:23:48.057 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:32:33 -0400 (0:00:00.171) 0:23:48.229 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:32:33 -0400 (0:00:00.271) 0:23:48.500 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:32:33 -0400 (0:00:00.183) 0:23:48.683 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:32:34 -0400 (0:00:00.510) 0:23:49.194 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:32:34 -0400 (0:00:00.249) 0:23:49.444 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:32:34 -0400 (0:00:00.212) 0:23:49.656 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:32:34 -0400 (0:00:00.201) 0:23:49.857 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK 
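Note on the fstab records above: the test pre-computes match lists from /etc/fstab (the storage_test_fstab_* facts) and the assertions that follow only compare list lengths against the expected counts. A minimal, self-contained sketch of the same pattern, assuming a slurp of /etc/fstab; the task names and the fstab_content/dev variables are illustrative, not taken from the role:

    - name: Read /etc/fstab
      slurp:
        path: /etc/fstab
      register: fstab_content

    - name: Assert exactly one fstab line mounts the LUKS mapper device at /opt/test1
      vars:
        dev: /dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623
        fstab_lines: "{{ (fstab_content.content | b64decode).splitlines() }}"
      assert:
        that:
          - fstab_lines | select('search', dev) | select('search', ' /opt/test1 ') | list | length == 1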
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 17 April 2026  20:32:34 -0400 (0:00:00.155)       0:23:50.013 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 17 April 2026  20:32:35 -0400 (0:00:00.171)       0:23:50.184 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 17 April 2026  20:32:35 -0400 (0:00:00.188)       0:23:50.373 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 April 2026  20:32:35 -0400 (0:00:00.203)       0:23:50.577 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472305.9644058, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472305.9644058, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 314888, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776472305.9644058, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 April 2026  20:32:36 -0400 (0:00:01.231)       0:23:51.808 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 April 2026  20:32:36 -0400 (0:00:00.190)       0:23:51.998 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 April 2026  20:32:37 -0400 (0:00:00.285)       0:23:52.284 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 April 2026  20:32:37 -0400 (0:00:00.248)       0:23:52.532 **********
ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 April 2026  20:32:37 -0400 (0:00:00.145)       0:23:52.678 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 April 2026  20:32:37 -0400 (0:00:00.192)       0:23:52.871 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 April 2026  20:32:38 -0400 (0:00:00.246)       0:23:53.117 **********
ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472306.1274064, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472306.1274064, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 334205, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776472306.1274064, "nlink": 1, "path": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 April 2026  20:32:39 -0400 (0:00:01.417)       0:23:54.534 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 April 2026  20:32:44 -0400 (0:00:04.604)       0:23:59.139 **********
ok: [managed-node12] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010330", "end": "2026-04-17 20:32:45.067378", "rc": 0, "start": "2026-04-17 20:32:45.057048" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           3731fa37-6840-411c-b129-58aaccae3623
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     909207
        Threads:    2
        Salt:       f6 47 df 74 24 14 4d 07 28 94 5c b5 1f 76 cb 76
                    16 b1 f2 23 6c 35 37 75 05 c1 b2 05 f9 11 8c a0
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       a6 ae 09 32 b7 91 64 35 18 51 f8 cd a4 34 00 60
                    0b fb 05 70 e3 5d ef 7b a2 36 51 61 d8 e1 72 47
        Digest:     a9 4d 41 76 4b 45 c4 18 84 ee 5b c7 1a 83 6b c2
                    34 34 10 a2 55 1a 76 09 b4 a2 60 97 ce b7 81 d7
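Note on the header dump above: the "Check LUKS version" and cipher assertions that follow operate directly on this cryptsetup luksDump output. A hedged sketch of that style of check, assuming a registered command result named luks_dump (the task names here are illustrative, not the role's actual tasks in test-verify-volume-encryption.yml):

    - name: Collect the LUKS2 header
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Assert the expected LUKS version and data-segment cipher
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')
          - luks_dump.stdout is search('cipher:\s+aes-xts-plain64')
        msg: LUKS header does not match the requested settings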
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 April 2026  20:32:45 -0400 (0:00:01.215)       0:24:00.354 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 April 2026  20:32:45 -0400 (0:00:00.334)       0:24:00.689 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 April 2026  20:32:45 -0400 (0:00:00.217)       0:24:00.906 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 April 2026  20:32:46 -0400 (0:00:00.283)       0:24:01.190 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 April 2026  20:32:46 -0400 (0:00:00.248)       0:24:01.439 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Friday 17 April 2026  20:32:46 -0400 (0:00:00.410)       0:24:01.850 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Friday 17 April 2026  20:32:47 -0400 (0:00:00.303)       0:24:02.153 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Friday 17 April 2026  20:32:47 -0400 (0:00:00.203)       0:24:02.357 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-3731fa37-6840-411c-b129-58aaccae3623 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Friday 17 April 2026  20:32:47 -0400 (0:00:00.316)       0:24:02.673 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Friday 17 April 2026  20:32:47 -0400 (0:00:00.297)       0:24:02.970 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Friday 17 April 2026  20:32:48 -0400 (0:00:00.268)       0:24:03.239 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Friday 17 April 2026  20:32:48 -0400 (0:00:00.148)       0:24:03.387 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Friday 17 April 2026  20:32:48 -0400 (0:00:00.330)       0:24:03.718 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 17 April 2026  20:32:49 -0400 (0:00:00.322)       0:24:04.140 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 17 April 2026  20:32:49 -0400 (0:00:00.230)       0:24:04.463 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 17 April 2026  20:32:49 -0400 (0:00:00.169)       0:24:04.694 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 17 April 2026  20:32:49 -0400 (0:00:00.169)       0:24:04.863 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 17 April 2026  20:32:50 -0400 (0:00:00.206)       0:24:05.070 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 17 April 2026  20:32:50 -0400 (0:00:00.157)       0:24:05.228 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 17 April 2026  20:32:50 -0400 (0:00:00.161)       0:24:05.389 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 April 2026  20:32:50 -0400 (0:00:00.173)       0:24:05.562 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 April 2026  20:32:50 -0400 (0:00:00.098)       0:24:05.661 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 April 2026  20:32:50 -0400 (0:00:00.126)       0:24:05.787 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 April 2026  20:32:50 -0400 (0:00:00.092)       0:24:05.880 **********
ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 April 2026  20:32:51 -0400 (0:00:01.039)       0:24:06.919 **********
ok: [managed-node12] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 17 April 2026  20:32:53 -0400 (0:00:01.432)       0:24:08.352 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 17 April 2026  20:32:53 -0400 (0:00:00.240)       0:24:08.593 **********
ok: [managed-node12] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 17 April 2026  20:32:53 -0400 (0:00:00.085)       0:24:08.678 **********
ok: [managed-node12] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 17 April 2026  20:32:55 -0400 (0:00:01.430)       0:24:10.108 **********
skipping: [managed-node12] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 17 April 2026  20:32:55 -0400 (0:00:00.283)       0:24:10.392 **********
skipping: [managed-node12] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 17 April 2026  20:32:55 -0400 (0:00:00.254)       0:24:10.647 **********
skipping: [managed-node12] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 17 April 2026  20:32:55 -0400 (0:00:00.270)       0:24:10.917 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Friday 17 April 2026  20:32:56 -0400 (0:00:00.189)       0:24:11.106 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Friday 17 April 2026  20:32:56 -0400 (0:00:00.286)       0:24:11.393 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Friday 17 April 2026  20:32:56 -0400 (0:00:00.217)       0:24:11.610 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Friday 17 April 2026  20:32:56 -0400 (0:00:00.215)       0:24:11.825 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Friday 17 April 2026  20:32:57 -0400 (0:00:00.364)       0:24:12.189 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Friday 17 April 2026  20:32:57 -0400 (0:00:00.240)       0:24:12.430 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Friday 17 April 2026  20:32:57 -0400 (0:00:00.249)       0:24:12.680 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Friday 17 April 2026  20:32:57 -0400 (0:00:00.220)       0:24:12.900 **********
skipping: [managed-node12] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Friday 17 April 2026  20:32:58 -0400 (0:00:00.232)       0:24:13.133 **********
skipping: [managed-node12] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Friday 17 April 2026  20:32:58 -0400 (0:00:00.184)       0:24:13.318 **********
skipping: [managed-node12] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Friday 17 April 2026  20:32:58 -0400 (0:00:00.166)       0:24:13.485 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Friday 17 April 2026  20:32:58 -0400 (0:00:00.095)       0:24:13.580 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Friday 17 April 2026  20:32:58 -0400 (0:00:00.107)       0:24:13.687 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Friday 17 April 2026  20:32:58 -0400 (0:00:00.100)       0:24:13.788 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Friday 17 April 2026  20:32:58 -0400 (0:00:00.095)       0:24:13.883 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Friday 17 April 2026  20:32:59 -0400 (0:00:00.171)       0:24:14.055 **********
ok: [managed-node12] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Friday 17 April 2026  20:32:59 -0400 (0:00:00.125)       0:24:14.180 **********
ok: [managed-node12] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Friday 17 April 2026  20:32:59 -0400 (0:00:00.197)       0:24:14.378 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 April 2026  20:32:59 -0400 (0:00:00.280)       0:24:14.658 **********
ok: [managed-node12] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023609", "end": "2026-04-17 20:33:00.750351", "rc": 0, "start": "2026-04-17 20:33:00.726742" }

STDOUT:

LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
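Note on the lvs output above: --nameprefixes --unquoted yields flat KEY=VALUE text, which is why the following tasks can lift LVM2_SEGTYPE straight out of stdout. A minimal sketch of the same check, assuming the registered variable lv_info (task names illustrative, not the role's):

    - name: Query LV attributes in machine-readable form
      command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lv_info
      changed_when: false

    - name: Assert the LV is a plain linear volume (not cached)
      assert:
        that:
          - lv_info.stdout is search('LVM2_SEGTYPE=linear')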
TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 April 2026  20:33:01 -0400 (0:00:01.412)       0:24:16.071 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 April 2026  20:33:01 -0400 (0:00:00.416)       0:24:16.488 **********
ok: [managed-node12] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 April 2026  20:33:01 -0400 (0:00:00.481)       0:24:16.969 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 April 2026  20:33:02 -0400 (0:00:00.209)       0:24:17.179 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 April 2026  20:33:02 -0400 (0:00:00.238)       0:24:17.418 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 April 2026  20:33:02 -0400 (0:00:00.272)       0:24:17.690 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 April 2026  20:33:02 -0400 (0:00:00.169)       0:24:17.860 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Friday 17 April 2026  20:33:03 -0400 (0:00:00.162)       0:24:18.022 **********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Friday 17 April 2026  20:33:03 -0400 (0:00:00.073)       0:24:18.096 **********
ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:516
Friday 17 April 2026  20:33:03 -0400 (0:00:00.133)       0:24:18.229 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node12
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Friday 17 April 2026  20:33:03 -0400 (0:00:00.367)       0:24:18.597 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Friday 17 April 2026  20:33:03 -0400 (0:00:00.223)       0:24:18.820 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 April 2026  20:33:04 -0400 (0:00:00.229)       0:24:19.050 **********
ok: [managed-node12] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Friday 17 April 2026  20:33:05 -0400 (0:00:01.732)       0:24:20.782 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 April 2026  20:33:06 -0400 (0:00:00.238)       0:24:21.021 **********
ok: [managed-node12]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 April 2026  20:33:07 -0400 (0:00:01.679)       0:24:22.700 **********
skipping: [managed-node12] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node12] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node12] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 April 2026  20:33:07 -0400 (0:00:00.278)       0:24:22.978 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 April 2026  20:33:08 -0400 (0:00:00.166)       0:24:23.145 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 April 2026  20:33:08 -0400 (0:00:00.161)       0:24:23.306 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 April 2026  20:33:08 -0400 (0:00:00.128)       0:24:23.435 **********
ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Friday 17 April 2026  20:33:08 -0400 (0:00:00.144)       0:24:23.579 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 April 2026  20:33:08 -0400 (0:00:00.374)       0:24:23.954 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Friday 17 April 2026  20:33:09 -0400 (0:00:00.164)       0:24:24.119 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Friday 17 April 2026  20:33:09 -0400 (0:00:00.139)       0:24:24.258 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Friday 17 April 2026  20:33:12 -0400 (0:00:03.632)       0:24:27.890 **********
ok: [managed-node12] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 17 April 2026  20:33:13 -0400 (0:00:00.214)       0:24:28.105 **********
ok: [managed-node12] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 April 2026  20:33:13 -0400 (0:00:00.234)       0:24:28.340 **********
ok: [managed-node12] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Friday 17 April 2026  20:33:18 -0400 (0:00:05.398)       0:24:33.739 **********
included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node12

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 April 2026  20:33:19 -0400 (0:00:00.327)       0:24:34.066 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 April 2026  20:33:19 -0400 (0:00:00.085)       0:24:34.152 **********
skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 April 2026  20:33:19 -0400 (0:00:00.197)       0:24:34.349 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Friday 17 April 2026  20:33:19 -0400 (0:00:00.128)       0:24:34.477 **********
ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Friday 17 April 2026  20:33:23 -0400 (0:00:04.007)       0:24:38.485 **********
ok: [managed-node12] => { "ansible_facts": { "services": {
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" },
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
"cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" },
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
"dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
"dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
"ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
"firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
"halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" },
"import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
"iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
"kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
"kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
"loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" },
"lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" },
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
"nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
"rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
"restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" },
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" },
"rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
"rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
"serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
"sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
"sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
"sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
"sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" },
"stratisd-min-postinitrd.service": {
"name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" 
}, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": 
{ "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Friday 17 April 2026 20:33:26 -0400 (0:00:02.588) 0:24:41.074 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Friday 17 April 2026 20:33:26 -0400 (0:00:00.245) 0:24:41.320 ********** changed: [managed-node12] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3731fa37-6840-411c-b129-58aaccae3623", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7", "_raw_device": 
"/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Friday 17 April 2026 20:33:32 -0400 (0:00:06.105) 0:24:47.425 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Friday 17 April 2026 20:33:32 -0400 (0:00:00.330) 0:24:47.756 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472313.4954362, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "daeff7caf4675a438bd7a7722c1a3c4735629f3a", "ctime": 1776472313.4914362, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 425721992, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776472313.4914362, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2455742250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 17 April 2026 20:33:35 -0400 (0:00:02.438) 0:24:50.194 ********** ok: [managed-node12] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Friday 17 April 2026 20:33:36 -0400 (0:00:01.194) 0:24:51.389 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Friday 17 April 2026 20:33:36 -0400 (0:00:00.193) 0:24:51.583 ********** ok: [managed-node12] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": "xfs" }, { "action": "destroy device", "device": 
"/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3731fa37-6840-411c-b129-58aaccae3623", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 April 2026 20:33:36 -0400 (0:00:00.178) 0:24:51.762 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Friday 17 April 2026 20:33:36 -0400 (0:00:00.202) 0:24:51.964 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": 
null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Friday 17 April 2026 20:33:37 -0400 (0:00:00.149) 0:24:52.114 ********** changed: [managed-node12] => (item={'src': '/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3731fa37-6840-411c-b129-58aaccae3623" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Friday 17 April 2026 20:33:38 -0400 (0:00:01.216) 0:24:53.331 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Friday 17 April 2026 20:33:39 -0400 (0:00:01.676) 0:24:55.007 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 April 2026 20:33:40 -0400 (0:00:00.286) 0:24:55.293 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Friday 17 April 2026 20:33:40 -0400 (0:00:00.275) 0:24:55.568 ********** ok: [managed-node12] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Friday 17 April 2026 20:33:42 -0400 (0:00:01.595) 0:24:57.163 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472324.649481, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "11a0c334d03ca675766bb2f740f41b2d9c3d4a2d", "ctime": 1776472318.2314553, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 289407173, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776472318.2304552, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "4171198076", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 
Friday 17 April 2026 20:33:43 -0400 (0:00:01.451) 0:24:58.615 ********** changed: [managed-node12] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-3731fa37-6840-411c-b129-58aaccae3623', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-3731fa37-6840-411c-b129-58aaccae3623", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Friday 17 April 2026 20:33:45 -0400 (0:00:01.656) 0:25:00.272 ********** ok: [managed-node12] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:525 Friday 17 April 2026 20:33:47 -0400 (0:00:01.861) 0:25:02.133 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node12 TASK [Print out pool information] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 April 2026 20:33:47 -0400 (0:00:00.374) 0:25:02.507 ********** skipping: [managed-node12] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 April 2026 20:33:47 -0400 (0:00:00.078) 0:25:02.586 ********** ok: [managed-node12] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=eej13c-kGjw-FkHJ-mwSL-iZme-3ovZ-QyKGP7", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 April 2026 20:33:47 -0400 (0:00:00.102) 0:25:02.688 ********** ok: [managed-node12] => { "changed": false, "info": { "/dev/loop0": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/loop0", "size": "", "type": "loop", "uuid": "" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 April 2026 20:33:49 -0400 (0:00:01.352) 0:25:04.041 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003249", "end": "2026-04-17 20:33:50.209953", "rc": 0, "start": "2026-04-17 20:33:50.206704" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 April 2026 20:33:50 -0400 (0:00:01.501) 0:25:05.543 ********** ok: [managed-node12] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002827", "end": "2026-04-17 20:33:51.459709", "failed_when_result": false, "rc": 0, "start": "2026-04-17 20:33:51.456882" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 April 2026 20:33:51 -0400 (0:00:01.188) 0:25:06.731 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 17 April 2026 20:33:51 -0400 (0:00:00.235) 0:25:06.966 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node12 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 April 2026 20:33:52 -0400 (0:00:00.392) 0:25:07.358 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 April 2026 20:33:52 -0400 (0:00:00.261) 0:25:07.620 ********** included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node12 included: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node12 included: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node12 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 April 2026 20:33:53 -0400 (0:00:00.935) 0:25:08.555 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 April 2026 20:33:53 -0400 (0:00:00.268) 0:25:08.823 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 April 2026 20:33:53 -0400 (0:00:00.165) 0:25:08.989 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 17 April 2026 20:33:54 -0400 (0:00:00.255) 0:25:09.244 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 17 April 2026 20:33:54 -0400 (0:00:00.157) 0:25:09.402 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 17 April 2026 20:33:54 -0400 (0:00:00.150) 0:25:09.552 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 17 April 2026 20:33:54 -0400 (0:00:00.171) 0:25:09.724 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 17 April 2026 20:33:54 -0400 (0:00:00.126) 0:25:09.851 ********** skipping: [managed-node12] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 17 April 2026 20:33:55 -0400 (0:00:00.251) 0:25:10.102 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 17 April 2026 20:33:55 -0400 (0:00:00.255) 0:25:10.358 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 17 April 2026 20:33:55 -0400 (0:00:00.187) 0:25:10.545 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 April 2026 20:33:55 -0400 (0:00:00.273) 0:25:10.819 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 April 2026 20:33:56 -0400 (0:00:00.546) 0:25:11.365 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 April 2026 20:33:56 -0400 (0:00:00.228) 0:25:11.593 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 April 2026 20:33:56 -0400 (0:00:00.191) 0:25:11.784 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 April 2026 20:33:57 -0400 (0:00:00.237) 0:25:12.021 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 April 2026 20:33:57 -0400 (0:00:00.246) 0:25:12.268 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 April 2026 20:33:57 -0400 (0:00:00.162) 0:25:12.431 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 April 2026 20:33:57 -0400 (0:00:00.302) 0:25:12.734 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 April 2026 20:33:57 -0400 (0:00:00.275) 0:25:13.009 ********** ok: [managed-node12] => { "changed": false, "stat": { "atime": 1776472411.9358327, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776472411.9358327, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36443, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776472411.9358327, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 April 2026 20:33:59 -0400 (0:00:01.562) 0:25:14.572 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 April 2026 20:33:59 -0400 (0:00:00.196) 0:25:14.768 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 April 2026 20:33:59 -0400 (0:00:00.112) 0:25:14.880 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 April 2026 20:34:00 -0400 (0:00:00.159) 0:25:15.040 ********** ok: [managed-node12] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 April 2026 20:34:00 -0400 (0:00:00.215) 0:25:15.256 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 April 2026 20:34:00 -0400 (0:00:00.123) 0:25:15.380 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 April 2026 20:34:00 -0400 (0:00:00.135) 0:25:15.515 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 April 2026 20:34:00 -0400 (0:00:00.177) 0:25:15.693 ********** ok: [managed-node12] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 April 2026 20:34:04 -0400 (0:00:03.560) 0:25:19.254 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 April 2026 20:34:04 -0400 (0:00:00.233) 0:25:19.487 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 April 2026 20:34:04 -0400 (0:00:00.111) 0:25:19.599 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 April 2026 20:34:04 -0400 (0:00:00.184) 0:25:19.783 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 April 2026 20:34:04 -0400 (0:00:00.156) 0:25:19.940 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 April 2026 20:34:05 -0400 (0:00:00.224) 0:25:20.164 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 17 April 2026 20:34:05 -0400 (0:00:00.148) 0:25:20.313 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 17 April 2026 20:34:05 -0400 (0:00:00.125) 0:25:20.438 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 17 April 2026 20:34:05 -0400 (0:00:00.146) 0:25:20.585 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 17 April 2026 20:34:05 -0400 (0:00:00.149) 0:25:20.734 ********** ok: [managed-node12] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 17 April 2026 20:34:05 -0400 (0:00:00.172) 0:25:20.907 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 17 April 2026 20:34:06 -0400 (0:00:00.225) 0:25:21.132 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 17 April 2026 20:34:06 -0400 (0:00:00.254) 0:25:21.387 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 17 April 2026 20:34:06 -0400 (0:00:00.257) 0:25:21.645 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 April 2026 20:34:06 -0400 (0:00:00.228) 0:25:21.873 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 April 2026 20:34:07 -0400 (0:00:00.245) 0:25:22.118 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 April 2026 20:34:07 -0400 (0:00:00.158) 0:25:22.278 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 April 2026 20:34:07 -0400 (0:00:00.254) 0:25:22.532 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 April 2026 20:34:07 -0400 (0:00:00.122) 0:25:22.655 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 April 2026 20:34:07 -0400 (0:00:00.282) 0:25:22.938 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 April 2026 20:34:08 -0400 (0:00:00.222) 0:25:23.160 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 April 2026 20:34:08 -0400 (0:00:00.181) 0:25:23.341 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 April 2026 20:34:08 -0400 (0:00:00.241) 0:25:23.582 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 April 2026 20:34:08 -0400 (0:00:00.226) 0:25:23.808 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 April 2026 20:34:09 -0400 (0:00:00.211) 0:25:24.020 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 April 2026 20:34:09 -0400 (0:00:00.196) 0:25:24.216 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 April 2026 20:34:09 -0400 (0:00:00.134) 0:25:24.351 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 April 2026 20:34:09 -0400 (0:00:00.109) 0:25:24.460 ********** ok: [managed-node12] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 April 2026 20:34:09 -0400 (0:00:00.065) 0:25:24.526 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 April 2026 20:34:09 -0400 (0:00:00.202) 0:25:24.729 ********** skipping: [managed-node12] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 April 2026 20:34:09 -0400 (0:00:00.142) 0:25:24.871 ********** skipping: [managed-node12] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 April 2026 20:34:10 -0400 (0:00:00.155) 0:25:25.027 ********** skipping: [managed-node12] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 April 2026 20:34:10 -0400 (0:00:00.152) 0:25:25.180 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 17 April 2026 20:34:10 -0400 (0:00:00.227) 0:25:25.407 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 17 April 2026 20:34:10 -0400 (0:00:00.170) 0:25:25.578 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 17 April 2026 20:34:10 -0400 (0:00:00.262) 0:25:25.841 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 17 April 2026 20:34:11 -0400 (0:00:00.248) 0:25:26.090 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 17 April 2026 20:34:11 -0400 (0:00:00.162) 0:25:26.252 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 17 April 2026 20:34:11 -0400 (0:00:00.245) 0:25:26.498 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 17 April 2026 20:34:11 -0400 (0:00:00.211) 0:25:26.709 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 17 April 2026 20:34:11 -0400 (0:00:00.161) 0:25:26.871 ********** skipping: [managed-node12] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 17 April 2026 20:34:12 -0400 (0:00:00.178) 0:25:27.050 ********** skipping: [managed-node12] => {} TASK [Show test volume 
size] *************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 17 April 2026 20:34:12 -0400 (0:00:00.121) 0:25:27.171 ********** skipping: [managed-node12] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 17 April 2026 20:34:12 -0400 (0:00:00.235) 0:25:27.407 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 17 April 2026 20:34:12 -0400 (0:00:00.159) 0:25:27.566 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 17 April 2026 20:34:12 -0400 (0:00:00.207) 0:25:27.774 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 17 April 2026 20:34:12 -0400 (0:00:00.128) 0:25:27.902 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 17 April 2026 20:34:12 -0400 (0:00:00.102) 0:25:28.005 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 17 April 2026 20:34:13 -0400 (0:00:00.202) 0:25:28.208 ********** ok: [managed-node12] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 17 April 2026 20:34:13 -0400 (0:00:00.291) 0:25:28.499 ********** ok: [managed-node12] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 17 April 2026 20:34:13 -0400 (0:00:00.285) 0:25:28.785 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 April 2026 20:34:13 -0400 
(0:00:00.202) 0:25:28.987 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 April 2026 20:34:14 -0400 (0:00:00.287) 0:25:29.275 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 April 2026 20:34:14 -0400 (0:00:00.242) 0:25:29.518 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 April 2026 20:34:14 -0400 (0:00:00.264) 0:25:29.782 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 April 2026 20:34:15 -0400 (0:00:00.247) 0:25:30.030 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 April 2026 20:34:15 -0400 (0:00:00.238) 0:25:30.269 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 April 2026 20:34:15 -0400 (0:00:00.311) 0:25:30.580 ********** skipping: [managed-node12] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 April 2026 20:34:15 -0400 (0:00:00.235) 0:25:30.816 ********** ok: [managed-node12] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 17 April 2026 20:34:15 -0400 (0:00:00.096) 0:25:30.912 ********** ok: [managed-node12] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node12 : ok=1271 changed=60 unreachable=0 failed=9 skipped=1110 rescued=9 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:10:06.671752+00:00Z", "host": "managed-node12", "message": 
"encrypted volume 'foo' missing key/password", "start_time": "2026-04-18T00:10:01.558072+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:10:06.958726+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:10:06.716792+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:12:20.656499+00:00Z", "host": "managed-node12", "message": "cannot remove existing formatting on device 'luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac' in safe mode due to encryption removal", "start_time": "2026-04-18T00:12:15.151329+00:00Z", "task_name": "Manage the pools and volumes to match the specified 
state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:12:21.020637+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-5d65d7e3-39d1-4eab-ace2-6c0e6b24faac' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:12:20.677880+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:14:18.281469+00:00Z", "host": "managed-node12", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-04-18T00:14:13.089450+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": 
"/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:14:18.508657+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:14:18.288598+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:16:20.801586+00:00Z", "host": "managed-node12", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-18T00:16:15.183625+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": 
"2026-04-18T00:16:21.144521+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:16:20.882750+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:18:52.286765+00:00Z", "host": "managed-node12", "message": "cannot remove existing formatting 
on device 'luks-814d1f35-ac59-4428-83ec-6f7b65c1154f' in safe mode due to encryption removal", "start_time": "2026-04-18T00:18:47.357124+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:18:52.466530+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing 
formatting on device 'luks-814d1f35-ac59-4428-83ec-6f7b65c1154f' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:18:52.314626+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:21:19.564948+00:00Z", "host": "managed-node12", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-04-18T00:21:14.143198+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:21:19.751189+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:21:19.572688+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:23:52.296760+00:00Z", "host": "managed-node12", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-18T00:23:46.780293+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:23:52.449321+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, 
"cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:23:52.303590+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:28:34.422313+00:00Z", "host": "managed-node12", "message": "cannot remove existing formatting on device 'luks-3d28318b-b762-4e84-b02d-ee21d873f8c1' in safe mode due to encryption removal", "start_time": "2026-04-18T00:28:28.276824+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:28:34.784691+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-3d28318b-b762-4e84-b02d-ee21d873f8c1' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:28:34.479768+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:31:09.874915+00:00Z", "host": "managed-node12", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-04-18T00:31:04.622228+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-18T00:31:10.093600+00:00Z", "host": "managed-node12", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-18T00:31:09.909373+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Friday 17 April 2026 20:34:16 -0400 (0:00:00.227) 0:25:31.139 ********** =============================================================================== fedora.linux_system_roles.storage : Record storage role fingerprint in syslog -- 35.53s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.57s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.54s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.99s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.60s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.53s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage 
the pools and volumes to match the specified state -- 13.47s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 9.31s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.31s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.20s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.11s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 6.01s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.95s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.93s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.75s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.75s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.74s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.71s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.68s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.67s /tmp/collections-PtH/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
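The nine failures collected in the SYSTEM ROLES ERRORS block above fall into two groups, both visible in the recorded module_args: volumes requested with "encryption": true while "encryption_key" and "encryption_password" are both null ("encrypted volume ... missing key/password"), and reformatting operations rejected because the module ran with "safe_mode": true ("cannot remove existing formatting ... in safe mode"). The recap's failed=9 together with rescued=9 suggests each failure was caught by a rescue block, i.e. these look like the test's expected-failure checks rather than regressions.

As a minimal sketch of an invocation that would avoid both error conditions, assuming the fedora.linux_system_roles.storage role variables that produced the module_args above (storage_safe_mode, storage_volumes, and the per-volume encryption keys); the luks_passphrase vault variable name is hypothetical:

    - hosts: managed-node12
      tasks:
        - name: Create a LUKS2-encrypted xfs volume on sda
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            # The role defaults to safe mode; leaving it enabled while asking the
            # role to replace existing formatting reproduces the "cannot remove
            # existing formatting ... in safe mode" errors recorded above.
            storage_safe_mode: false
            storage_volumes:
              - name: foo
                type: disk
                disks: [sda]
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                # A key or password is required whenever encryption is requested;
                # leaving both null reproduces the "missing key/password" errors.
                encryption_password: "{{ luks_passphrase }}"  # hypothetical vault var

With storage_safe_mode left at its default (true), the same invocation fails exactly as the error records show, which is the behavior these tests appear to exercise deliberately.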