ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Monday 20 April 2026 05:03:39 -0400 (0:00:00.335) 0:00:00.335 **********
ok: [managed-node2]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Monday 20 April 2026 05:03:43 -0400 (0:00:04.437) 0:00:04.773 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot] ******************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Monday 20 April 2026 05:03:44 -0400 (0:00:00.568) 0:00:05.341 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Monday 20 April 2026 05:03:44 -0400 (0:00:00.489) 0:00:05.830 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Monday 20 April 2026 05:03:45 -0400 (0:00:00.428) 0:00:06.259 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Monday 20 April 2026 05:03:45 -0400 (0:00:00.254) 0:00:06.514 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Monday 20 April 2026 05:03:45 -0400 (0:00:00.407) 0:00:06.922 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Monday 20 April 2026 05:03:46 -0400 (0:00:00.318) 0:00:07.240 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Run the role] ************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Monday 20 April 2026 05:03:46 -0400 (0:00:00.328) 0:00:07.568 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 05:03:46 -0400 (0:00:00.264) 0:00:07.832 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 05:03:46 -0400 (0:00:00.188) 0:00:08.021 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 05:03:47 -0400 (0:00:00.581) 0:00:08.602 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/libexec/platform-python"
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Monday 20 April 2026 05:03:49 -0400 (0:00:01.732) 0:00:10.335 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 05:03:49 -0400 (0:00:00.208) 0:00:10.543 **********
ok: [managed-node2]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 05:03:51 -0400 (0:00:01.797) 0:00:12.341 **********
skipping: [managed-node2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node2] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node2] => (item=CentOS_8.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_8.yml"
}
ok: [managed-node2] => (item=CentOS_8.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_8.yml"
}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 20 April 2026 05:03:51 -0400 (0:00:00.453) 0:00:12.794 **********
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 20 April 2026 05:03:54 -0400 (0:00:03.200) 0:00:15.995 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "__storage_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:03:55 -0400 (0:00:00.248) 0:00:16.243 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:03:55 -0400 (0:00:00.159) 0:00:16.402 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:03:55 -0400 (0:00:00.205) 0:00:16.608 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:03:56 -0400 (0:00:00.848) 0:00:17.456 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:03:56 -0400 (0:00:00.236) 0:00:17.693 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 05:03:56 -0400 (0:00:00.164) 0:00:17.857 **********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 05:04:01 -0400 (0:00:05.070) 0:00:22.928 **********
ok: [managed-node2] => {
    "storage_pools | d([])": []
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 05:04:02 -0400 (0:00:00.216) 0:00:23.145 **********
ok: [managed-node2] => {
    "storage_volumes | d([])": []
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 05:04:02 -0400 (0:00:00.253) 0:00:23.398 **********
ok: [managed-node2] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 05:04:05 -0400 (0:00:02.920) 0:00:26.319 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 05:04:05 -0400 (0:00:00.361) 0:00:26.681 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 05:04:05 -0400 (0:00:00.160) 0:00:26.841 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 05:04:05 -0400 (0:00:00.238) 0:00:27.080 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 05:04:06 -0400 (0:00:00.136) 0:00:27.216 **********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 05:04:09 -0400 (0:00:03.761) 0:00:30.978 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" },
            "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name":
"systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:04:13 -0400 (0:00:03.386) 0:00:34.364 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:04:13 -0400 (0:00:00.202) 0:00:34.567 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:04:15 -0400 (0:00:01.774) 0:00:36.342 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:04:15 -0400 (0:00:00.134) 0:00:36.476 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776675508.8397214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776675507.2897248, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776675507.2897248, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:04:16 -0400 (0:00:01.200) 0:00:37.677 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:04:16 -0400 (0:00:00.338) 0:00:38.015 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:04:17 -0400 (0:00:00.355) 0:00:38.371 ********** ok: [managed-node2] => { 
"blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:04:17 -0400 (0:00:00.277) 0:00:38.648 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:04:17 -0400 (0:00:00.251) 0:00:38.899 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:04:17 -0400 (0:00:00.222) 0:00:39.121 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:04:18 -0400 (0:00:00.214) 0:00:39.336 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:04:18 -0400 (0:00:00.253) 0:00:39.589 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:04:18 -0400 (0:00:00.212) 0:00:39.802 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:04:18 -0400 (0:00:00.234) 0:00:40.036 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:04:19 -0400 (0:00:00.155) 0:00:40.191 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776674691.7636418, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:04:20 -0400 (0:00:01.598) 0:00:41.790 ********** TASK 
[fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:04:20 -0400 (0:00:00.230) 0:00:42.021 ********** ok: [managed-node2] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:75 Monday 20 April 2026 05:04:22 -0400 (0:00:01.872) 0:00:43.893 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node2 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Monday 20 April 2026 05:04:23 -0400 (0:00:00.262) 0:00:44.156 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Monday 20 April 2026 05:04:27 -0400 (0:00:04.242) 0:00:48.398 ********** ok: [managed-node2] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" 
LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Monday 20 April 2026 05:04:29 -0400 (0:00:02.576) 0:00:50.975 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Monday 20 April 2026 05:04:30 -0400 (0:00:00.294) 0:00:51.270 ********** ok: [managed-node2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Monday 20 April 2026 05:04:30 -0400 (0:00:00.263) 0:00:51.533 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Monday 20 April 2026 05:04:30 -0400 (0:00:00.152) 
0:00:51.685 ********** ok: [managed-node2] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:84 Monday 20 April 2026 05:04:30 -0400 (0:00:00.136) 0:00:51.821 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 05:04:30 -0400 (0:00:00.252) 0:00:52.074 ********** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 05:04:31 -0400 (0:00:00.132) 0:00:52.207 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:04:31 -0400 (0:00:00.173) 0:00:52.380 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:04:31 -0400 (0:00:00.098) 0:00:52.479 
********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:04:31 -0400 (0:00:00.252) 0:00:52.731 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:04:32 -0400 (0:00:01.322) 0:00:54.053 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:04:33 -0400 (0:00:00.149) 0:00:54.203 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:04:34 -0400 (0:00:01.406) 0:00:55.610 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' 
if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 20 April 2026 05:04:34 -0400 (0:00:00.390) 0:00:56.000 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 20 April 2026 05:04:35 -0400 (0:00:00.211) 0:00:56.212 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:04:35 -0400 (0:00:00.226) 0:00:56.438 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:04:35 -0400 (0:00:00.137) 0:00:56.576 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:04:35 -0400 (0:00:00.180) 0:00:56.756 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:04:35 -0400 (0:00:00.346) 0:00:57.103 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:04:36 -0400 (0:00:00.154) 0:00:57.258 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 05:04:36 -0400 (0:00:00.172) 0:00:57.431 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 05:04:40 -0400 (0:00:04.037) 0:01:01.469 **********
ok: [managed-node2] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 05:04:40 -0400 (0:00:00.139) 0:01:01.608 **********
ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 05:04:40 -0400 (0:00:00.253) 0:01:01.862 **********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 05:04:45 -0400 (0:00:04.783) 0:01:06.645 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 05:04:45 -0400 (0:00:00.267) 0:01:06.913 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 05:04:45 -0400 (0:00:00.110) 0:01:07.024 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 05:04:46 -0400 (0:00:00.131) 0:01:07.155 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 05:04:46 -0400 (0:00:00.071) 0:01:07.227 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 05:04:49 -0400 (0:00:03.846) 0:01:11.073 **********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state":
"unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": 
"plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": 
"systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:04:52 -0400 (0:00:02.984) 0:01:14.058 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:04:53 -0400 (0:00:00.388) 0:01:14.447 ********** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:04:58 -0400 (0:00:05.195) 0:01:19.642 ********** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 
'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:04:58 -0400 (0:00:00.155) 0:01:19.798 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:04:58 -0400 (0:00:00.272) 0:01:20.070 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:04:59 -0400 (0:00:00.204) 0:01:20.275 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:04:59 -0400 (0:00:00.449) 0:01:20.725 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task 
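[Note: the failure above is the expected negative test. The volume spec set 'encryption': True but left both 'encryption_password' and 'encryption_key' as None (visible in the module_args dump), so blivet aborts with "encrypted volume 'foo' missing key/password". The follow-up run supplies a password; a minimal sketch of the corrected storage_volumes entry, reconstructed from the values this log prints at main-blivet.yml:32, would look like:

```yaml
storage_volumes:
  - name: foo
    type: disk
    disks: [sda]
    mount_point: /opt/test1
    encryption: true
    encryption_luks_version: luks2
    # throwaway test password taken from this log; never hardcode a real one
    encryption_password: yabbadabbadoo
```

Either encryption_password or encryption_key must be non-null whenever encryption is true, or the role fails before taking any actions, as seen here.]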
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:100 Monday 20 April 2026 05:04:59 -0400 (0:00:00.167) 0:01:20.893 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:05:00 -0400 (0:00:00.421) 0:01:21.314 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:05:00 -0400 (0:00:00.249) 0:01:21.564 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:05:00 -0400 (0:00:00.259) 0:01:21.823 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:05:02 -0400 (0:00:01.469) 0:01:23.293 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 05:05:02 -0400 (0:00:00.136) 0:01:23.429 **********
ok: [managed-node2]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 05:05:04 -0400 (0:00:01.879) 0:01:25.309 **********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 20 April 2026 05:05:04 -0400 (0:00:00.258) 0:01:25.820 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 20 April 2026 05:05:04 -0400 (0:00:00.258) 0:01:26.078 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:05:05 -0400 (0:00:00.176) 0:01:26.337 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:05:05 -0400 (0:00:00.182) 0:01:26.514 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:05:05 -0400 (0:00:00.434) 0:01:26.696 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:05:05 -0400 (0:00:00.236) 0:01:27.131 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:05:06 -0400 (0:00:00.179) 0:01:27.367 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 05:05:06 -0400 (0:00:04.198) 0:01:27.546 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 05:05:10 -0400 (0:00:00.199) 0:01:31.745 **********
ok: [managed-node2] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 05:05:10 -0400 (0:00:00.162) 0:01:31.945 **********
ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 05:05:10 -0400 (0:00:04.795) 0:01:32.108 **********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 05:05:15 -0400 (0:00:00.210) 0:01:36.903 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 05:05:15 -0400 (0:00:00.233) 0:01:37.114 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 05:05:16 -0400 (0:00:00.206) 0:01:37.347 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 05:05:16 -0400 (0:00:00.137) 0:01:37.554 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 05:05:16 -0400 (0:00:04.193) 0:01:37.691 **********
ok: [managed-node2] => { "changed":
false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:05:20 -0400 (0:00:04.193) 0:01:41.885 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" 
}, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": 
"firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": 
"plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": 
"rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", 
"state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": 
"systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": 
"systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82
Monday 20 April 2026 05:05:23 -0400 (0:00:00.170) 0:01:44.661 **********

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
Monday 20 April 2026 05:05:23 -0400 (0:00:13.349) 0:01:44.832 **********
changed: [managed-node2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103
Monday 20 April 2026 05:05:37 -0400 (0:00:00.166) 0:01:58.181 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110
Monday 20 April 2026 05:05:37 -0400 (0:00:01.245) 0:01:58.347 **********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776675508.8397214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776675507.2897248, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776675507.2897248, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Monday 20 April 2026 05:05:38 -0400 (0:00:01.672) 0:01:59.593 **********
ok: [managed-node2] => { "backup": "", "changed": false }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133
Monday 20 April 2026 05:05:40 -0400 (0:00:00.069) 0:02:01.266 **********

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139
Monday 20 April 2026 05:05:40 -0400 (0:00:00.097) 0:02:01.335 **********
ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Monday 20 April 2026 05:05:40 -0400 (0:00:00.074) 0:02:01.432 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152
Monday 20 April 2026 05:05:40 -0400 0:02:01.506 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device":
"/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:05:40 -0400 (0:00:00.202) 0:02:01.709 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:05:40 -0400 (0:00:00.193) 0:02:01.902 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:05:44 -0400 (0:00:04.187) 0:02:06.090 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:05:47 -0400 (0:00:02.063) 0:02:08.154 ********** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:05:47 -0400 
(0:00:00.284) 0:02:08.439 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:05:48 -0400 (0:00:01.701) 0:02:10.140 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776674691.7636418, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:05:50 -0400 (0:00:01.366) 0:02:11.506 ********** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-00a7163b-2630-4914-9eff-8d6f78b6405b', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:05:51 -0400 (0:00:01.436) 0:02:12.942 ********** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:112 Monday 20 April 2026 05:05:53 -0400 (0:00:01.747) 0:02:14.690 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:05:53 -0400 (0:00:00.182) 0:02:14.873 ********** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:05:53 -0400 (0:00:00.086) 0:02:14.960 ********** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:05:53 -0400 (0:00:00.152) 0:02:15.113 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "size": "10G", "type": "crypt", "uuid": "bdd6f842-6d7d-457a-8cea-8fa0ec8be9e9" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "00a7163b-2630-4914-9eff-8d6f78b6405b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", 
"size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:05:56 -0400 (0:00:02.339) 0:02:17.452 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002216", "end": "2026-04-20 05:05:58.161314", "rc": 0, "start": "2026-04-20 05:05:58.159098" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:05:58 -0400 (0:00:02.109) 0:02:19.562 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002255", "end": "2026-04-20 05:05:59.559060", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:05:59.556805" } STDOUT: luks-00a7163b-2630-4914-9eff-8d6f78b6405b /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:05:59 -0400 (0:00:01.394) 0:02:20.956 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 05:05:59 -0400 (0:00:00.149) 0:02:21.106 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:06:00 -0400 (0:00:00.419) 0:02:21.526 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:06:00 -0400 (0:00:00.211) 0:02:21.738 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:06:01 -0400 (0:00:00.921) 0:02:22.660 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:06:01 -0400 (0:00:00.159) 0:02:22.819 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:06:01 -0400 (0:00:00.117) 0:02:22.937 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:06:02 -0400 (0:00:00.228) 0:02:23.165 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:06:02 -0400 (0:00:00.202) 0:02:23.368 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:06:02 -0400 (0:00:00.220) 0:02:23.589 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:06:02 -0400 (0:00:00.238) 0:02:23.828 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 05:06:02 -0400 (0:00:00.233) 0:02:24.061 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:06:03 -0400 (0:00:00.131) 0:02:24.193 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 
April 2026 05:06:03 -0400 (0:00:00.146) 0:02:24.339 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:06:03 -0400 (0:00:00.210) 0:02:24.550 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:06:03 -0400 (0:00:00.076) 0:02:24.626 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 05:06:03 -0400 (0:00:00.264) 0:02:24.890 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 
April 2026 05:06:04 -0400 (0:00:00.319) 0:02:25.210 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 05:06:04 -0400 (0:00:00.183) 0:02:25.393 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 05:06:04 -0400 (0:00:00.168) 0:02:25.562 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 05:06:04 -0400 (0:00:00.195) 0:02:25.758 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 05:06:04 -0400 (0:00:00.132) 0:02:25.890 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 05:06:05 -0400 (0:00:00.251) 0:02:26.142 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 05:06:05 -0400 (0:00:00.279) 0:02:26.421 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776675936.6707592, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776675936.6707592, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37259, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776675936.6707592, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 05:06:06 -0400 (0:00:01.161) 0:02:27.583 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 05:06:06 -0400 (0:00:00.146) 0:02:27.729 ********** skipping: [managed-node2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 05:06:06 -0400 (0:00:00.086) 0:02:27.816 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 05:06:06 -0400 (0:00:00.094) 0:02:27.910 ********** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 05:06:06 -0400 (0:00:00.068) 0:02:27.979 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:06:06 -0400 (0:00:00.082) 0:02:28.061 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:06:07 -0400 (0:00:00.086) 0:02:28.147 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776675936.8007588, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776675936.8007588, "dev": 6, 
"device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 172326, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776675936.8007588, "nlink": 1, "path": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:06:07 -0400 (0:00:00.982) 0:02:29.129 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:06:11 -0400 (0:00:03.827) 0:02:32.957 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012496", "end": "2026-04-20 05:06:13.114493", "rc": 0, "start": "2026-04-20 05:06:13.101997" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 00a7163b-2630-4914-9eff-8d6f78b6405b Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 916121 Threads: 2 Salt: 05 65 29 1e cf f1 46 f2 
53 38 6e d0 dd bb 1a 64 c1 82 f0 bd fd 0c 7d 24 75 78 74 65 c1 09 b9 c8 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: ae 7f aa 70 22 61 f1 a5 7e 78 30 2d 91 ff 65 1e 54 b6 9b c0 ad a9 0d 25 9b cc 05 34 61 02 5a 33 Digest: 0f 0e e4 2b b3 9f ac ec 7f ad a1 dc 6f 33 46 76 b7 0d ea 5f 3c d0 99 d5 fd 92 21 9e 5b 46 93 53 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:06:13 -0400 (0:00:01.541) 0:02:34.499 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:06:13 -0400 (0:00:00.330) 0:02:34.829 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:06:14 -0400 (0:00:00.330) 0:02:35.160 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 05:06:14 -0400 (0:00:00.272) 0:02:35.432 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:06:14 -0400 (0:00:00.186) 0:02:35.619 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:06:14 -0400 (0:00:00.264) 0:02:35.884 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:06:14 -0400 (0:00:00.133) 0:02:36.018 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:06:15 -0400 (0:00:00.169) 0:02:36.188 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-00a7163b-2630-4914-9eff-8d6f78b6405b /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 05:06:15 -0400 (0:00:00.174) 0:02:36.363 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 05:06:15 -0400 (0:00:00.202) 0:02:36.565 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 05:06:15 -0400 (0:00:00.379) 0:02:36.945 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 05:06:16 -0400 (0:00:00.277) 0:02:37.222 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 05:06:16 -0400 (0:00:00.289) 0:02:37.512 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 05:06:16 -0400 (0:00:00.255) 0:02:37.767 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 05:06:16 -0400 (0:00:00.348) 0:02:38.116 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 05:06:17 -0400 (0:00:00.176) 0:02:38.292 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 05:06:17 -0400 (0:00:00.363) 0:02:38.655 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:06:17 -0400 (0:00:00.195) 0:02:38.850 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:06:18 -0400 (0:00:00.305) 0:02:39.156 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:06:18 
-0400 (0:00:00.312) 0:02:39.469 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:06:18 -0400 (0:00:00.309) 0:02:39.778 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:06:18 -0400 (0:00:00.326) 0:02:40.105 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:06:19 -0400 (0:00:00.237) 0:02:40.342 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:06:19 -0400 (0:00:00.221) 0:02:40.563 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 05:06:19 -0400 (0:00:00.182) 0:02:40.746 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:06:19 -0400 (0:00:00.203) 0:02:40.950 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:06:20 -0400 (0:00:00.343) 0:02:41.293 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:06:20 -0400 (0:00:00.290) 0:02:41.584 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 05:06:20 -0400 (0:00:00.258) 0:02:41.843 ********** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 05:06:20 -0400 (0:00:00.221) 0:02:42.064 ********** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 
2026 05:06:21 -0400 (0:00:00.192) 0:02:42.257 ********** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 05:06:21 -0400 (0:00:00.332) 0:02:42.589 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 05:06:21 -0400 (0:00:00.224) 0:02:42.813 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 05:06:21 -0400 (0:00:00.281) 0:02:43.095 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 05:06:22 -0400 (0:00:00.213) 0:02:43.308 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 05:06:22 -0400 (0:00:00.214) 0:02:43.522 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max 
usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 05:06:22 -0400 (0:00:00.289) 0:02:43.812 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 05:06:22 -0400 (0:00:00.291) 0:02:44.104 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 05:06:23 -0400 (0:00:00.240) 0:02:44.345 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 05:06:23 -0400 (0:00:00.245) 0:02:44.590 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:06:23 -0400 (0:00:00.307) 0:02:44.898 ********** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 05:06:23 -0400 (0:00:00.217) 0:02:45.115 ********** skipping: 
[managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 05:06:24 -0400 (0:00:00.159) 0:02:45.275 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 05:06:24 -0400 (0:00:00.216) 0:02:45.491 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 05:06:24 -0400 (0:00:00.176) 0:02:45.668 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 05:06:24 -0400 (0:00:00.205) 0:02:45.874 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 05:06:24 -0400 (0:00:00.233) 0:02:46.107 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 05:06:25 -0400 (0:00:00.276) 0:02:46.384 ********** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 05:06:25 -0400 (0:00:00.228) 0:02:46.613 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 05:06:25 -0400 (0:00:00.333) 0:02:46.946 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 05:06:26 -0400 (0:00:00.224) 0:02:47.170 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 05:06:26 -0400 (0:00:00.175) 0:02:47.346 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 05:06:26 -0400 (0:00:00.177) 0:02:47.524 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 05:06:26 -0400 (0:00:00.143) 0:02:47.668 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 05:06:26 -0400 (0:00:00.216) 0:02:47.885 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 05:06:26 -0400 (0:00:00.246) 0:02:48.131 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 05:06:27 -0400 (0:00:00.220) 0:02:48.352 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 
2026 05:06:27 -0400 (0:00:00.184) 0:02:48.537 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 05:06:27 -0400 (0:00:00.134) 0:02:48.672 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 05:06:27 -0400 (0:00:00.207) 0:02:48.879 ********** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:118 Monday 20 April 2026 05:06:30 -0400 (0:00:02.384) 0:02:51.264 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 05:06:30 -0400 (0:00:00.486) 0:02:51.750 ********** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 05:06:30 -0400 (0:00:00.196) 0:02:51.947 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:06:31 -0400 (0:00:00.412) 0:02:52.359 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:06:31 -0400 (0:00:00.201) 0:02:52.560 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:06:31 -0400 (0:00:00.247) 0:02:52.808 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:06:33 -0400 (0:00:01.968) 0:02:54.776 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:06:33 -0400 (0:00:00.236) 0:02:55.013 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:06:36 -0400 (0:00:02.128) 0:02:57.141 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if 
system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:06:36 -0400 (0:00:00.367) 0:02:57.508 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:06:36 -0400 (0:00:00.285) 0:02:57.793 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:06:36 -0400 (0:00:00.312) 0:02:58.105 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:06:37 -0400 (0:00:00.217) 0:02:58.323 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:06:37 -0400 (0:00:00.193) 0:02:58.516 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:06:37 -0400 (0:00:00.608) 0:02:59.124 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:06:38 -0400 (0:00:00.257) 0:02:59.382 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:06:38 -0400 (0:00:00.360) 0:02:59.743 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:06:43 -0400 (0:00:04.440) 0:03:04.184 ********** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:06:43 -0400 (0:00:00.244) 0:03:04.429 ********** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:06:43 -0400 (0:00:00.239) 0:03:04.669 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:06:49 -0400 (0:00:05.691) 0:03:10.360 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:06:49 -0400 (0:00:00.414) 0:03:10.774 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:06:49 -0400 (0:00:00.203) 0:03:10.978 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:06:50 -0400 (0:00:00.248) 0:03:11.227 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:06:50 -0400 (0:00:00.173) 0:03:11.400 ********** ok: [managed-node2] => { "changed": false, "rc": 
0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:06:54 -0400 (0:00:03.779) 0:03:15.180 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": 
"firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": 
"plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": 
"rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", 
"state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": 
"systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": 
"systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:06:56 -0400 (0:00:02.580) 0:03:17.760 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:06:56 -0400 (0:00:00.184) 0:03:17.944 ********** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-00a7163b-2630-4914-9eff-8d6f78b6405b' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:07:02 -0400 (0:00:05.281) 0:03:23.226 ********** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-00a7163b-2630-4914-9eff-8d6f78b6405b' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 
0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:07:02 -0400 (0:00:00.249) 0:03:23.475 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:07:02 -0400 (0:00:00.359) 0:03:23.834 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:07:02 -0400 (0:00:00.204) 0:03:24.038 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:07:03 -0400 (0:00:00.431) 0:03:24.470 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 05:07:03 -0400 (0:00:00.230) 0:03:24.700 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776675989.9086397, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776675989.9086397, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776675989.9086397, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2295265881", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 05:07:04 -0400 (0:00:01.391) 0:03:26.092 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:139 Monday 20 April 2026 05:07:05 -0400 (0:00:00.141) 0:03:26.234 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] 
************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:07:05 -0400 (0:00:00.275) 0:03:26.510 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:07:05 -0400 (0:00:00.171) 0:03:26.682 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:07:05 -0400 (0:00:00.329) 0:03:27.012 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:07:07 -0400 (0:00:01.388) 0:03:28.400 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:07:07 -0400 (0:00:00.109) 0:03:28.510 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:07:09 -0400 (0:00:01.648) 0:03:30.158 
********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:07:09 -0400 (0:00:00.232) 0:03:30.391 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:07:09 -0400 (0:00:00.136) 0:03:30.528 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:07:09 -0400 (0:00:00.129) 0:03:30.657 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:07:09 -0400 (0:00:00.096) 0:03:30.753 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:07:09 -0400 (0:00:00.138) 0:03:30.892 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:07:10 -0400 (0:00:00.298) 0:03:31.191 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 
April 2026 05:07:10 -0400 (0:00:00.298) 0:03:31.489 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:07:10 -0400 (0:00:00.176) 0:03:31.665 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:07:14 -0400 (0:00:04.165) 0:03:35.831 ********** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:07:14 -0400 (0:00:00.185) 0:03:36.017 ********** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:07:15 -0400 (0:00:00.158) 0:03:36.175 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:07:20 
-0400 (0:00:05.254) 0:03:41.429 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:07:20 -0400 (0:00:00.198) 0:03:41.628 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:07:20 -0400 (0:00:00.184) 0:03:41.813 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:07:20 -0400 (0:00:00.216) 0:03:42.030 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:07:21 -0400 (0:00:00.116) 0:03:42.146 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:07:24 -0400 (0:00:03.886) 0:03:46.032 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": 
{ "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { 
"name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:07:27 -0400 (0:00:02.322) 0:03:48.355 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:07:27 -0400 (0:00:00.210) 0:03:48.566 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 
10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:07:32 -0400 (0:00:05.433) 0:03:54.000 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:07:33 -0400 (0:00:00.198) 0:03:54.199 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776675946.7247367, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "048e24524ee4739b6dec5912df57b237877e5bfd", "ctime": 1776675946.7227366, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776675946.7227366, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:07:34 -0400 (0:00:01.234) 0:03:55.434 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK 
[fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:07:35 -0400 (0:00:01.300) 0:03:56.735 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:07:35 -0400 (0:00:00.216) 0:03:56.952 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:07:36 -0400 (0:00:00.286) 0:03:57.238 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:07:36 -0400 (0:00:00.264) 0:03:57.502 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:07:36 -0400 (0:00:00.251) 0:03:57.753 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-00a7163b-2630-4914-9eff-8d6f78b6405b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:07:37 -0400 (0:00:01.338) 0:03:59.091 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:07:39 -0400 (0:00:01.555) 0:04:00.647 ********** changed: [managed-node2] => (item={'src': 'UUID=ca304cfd-7efc-4432-b90d-3d46627b677d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:07:40 -0400 (0:00:01.349) 0:04:01.997 ********** skipping: [managed-node2] => (item={'src': 'UUID=ca304cfd-7efc-4432-b90d-3d46627b677d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:07:41 -0400 (0:00:00.147) 0:04:02.144 ********** ok: [managed-node2] => { 
"changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:07:42 -0400 (0:00:01.544) 0:04:03.688 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776675959.5577078, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c553e5cfd49ea96888d63bf4dd8c7b85daf18ec1", "ctime": 1776675951.544726, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 381681795, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776675951.5437257, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "4270562343", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:07:43 -0400 (0:00:01.152) 0:04:04.841 ********** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-00a7163b-2630-4914-9eff-8d6f78b6405b', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:07:45 -0400 (0:00:01.356) 0:04:06.197 ********** ok: [managed-node2] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:152 Monday 20 April 2026 05:07:46 -0400 (0:00:01.376) 0:04:07.573 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:07:46 -0400 (0:00:00.218) 0:04:07.792 ********** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:07:47 -0400 (0:00:00.501) 0:04:08.293 ********** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:07:47 -0400 (0:00:00.159) 0:04:08.453 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ca304cfd-7efc-4432-b90d-3d46627b677d" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", 
"mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:07:48 -0400 (0:00:00.871) 0:04:09.324 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002697", "end": "2026-04-20 05:07:49.228938", "rc": 0, "start": "2026-04-20 05:07:49.226241" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=ca304cfd-7efc-4432-b90d-3d46627b677d /opt/test1 xfs defaults 0 0 TASK [Read the 
/etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:07:49 -0400 (0:00:01.241) 0:04:10.567 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002824", "end": "2026-04-20 05:07:50.503276", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:07:50.500452" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:07:50 -0400 (0:00:01.354) 0:04:11.921 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 05:07:50 -0400 (0:00:00.093) 0:04:12.014 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:07:51 -0400 (0:00:00.133) 0:04:12.147 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:07:51 -0400 (0:00:00.144) 0:04:12.291 ********** included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:07:52 -0400 (0:00:00.999) 0:04:13.290 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:07:52 -0400 (0:00:00.191) 0:04:13.482 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:07:52 -0400 (0:00:00.196) 0:04:13.678 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:07:52 -0400 (0:00:00.280) 0:04:13.959 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:07:53 -0400 (0:00:00.264) 0:04:14.223 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:07:53 -0400 (0:00:00.234) 0:04:14.457 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:07:53 -0400 (0:00:00.199) 0:04:14.656 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 
05:07:53 -0400 (0:00:00.189) 0:04:14.846 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:07:53 -0400 (0:00:00.180) 0:04:15.026 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 05:07:54 -0400 (0:00:00.168) 0:04:15.194 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:07:54 -0400 (0:00:00.095) 0:04:15.290 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:07:54 -0400 (0:00:00.118) 0:04:15.409 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 20 April 2026 05:07:54 -0400 (0:00:00.539) 0:04:15.948 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 20 April 2026 05:07:54 -0400 (0:00:00.160) 0:04:16.109 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 20 April 2026 05:07:55 -0400 (0:00:00.240) 0:04:16.350 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026 05:07:55 -0400 (0:00:00.263) 0:04:16.614 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026 05:07:55 -0400 (0:00:00.198) 0:04:16.812 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026 05:07:55 -0400 (0:00:00.155) 0:04:16.968 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026 05:07:56 -0400 (0:00:00.235) 0:04:17.204 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026 05:07:56 -0400 (0:00:00.299) 0:04:17.504 **********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676052.564499, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676052.564499, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37259, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776676052.564499, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026 05:07:57 -0400 (0:00:01.412) 0:04:18.916 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026 05:07:57 -0400 (0:00:00.160) 0:04:19.077 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026 05:07:58 -0400 (0:00:00.202) 0:04:19.279 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026 05:07:58 -0400 (0:00:00.282) 0:04:19.562 **********
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026 05:07:58 -0400 (0:00:00.213) 0:04:19.776 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026 05:07:58 -0400 (0:00:00.209) 0:04:19.985 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026 05:07:59 -0400 (0:00:00.232) 0:04:20.217 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026 05:07:59 -0400 (0:00:00.170) 0:04:20.388 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026 05:08:03 -0400 (0:00:04.198) 0:04:24.586 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 20 April 2026 05:08:03 -0400 (0:00:00.234) 0:04:24.821 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 20 April 2026 05:08:03 -0400 (0:00:00.204) 0:04:25.025 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 20 April 2026 05:08:04 -0400 (0:00:00.305) 0:04:25.331 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026 05:08:04 -0400 (0:00:00.209) 0:04:25.540 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026 05:08:04 -0400 (0:00:00.290) 0:04:25.830 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026 05:08:04 -0400 (0:00:00.213) 0:04:26.044 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026 05:08:05 -0400 (0:00:00.249) 0:04:26.293 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026 05:08:05 -0400 (0:00:00.219) 0:04:26.512 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026 05:08:05 -0400 (0:00:00.212) 0:04:26.724 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026 05:08:05 -0400 (0:00:00.261) 0:04:26.986 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026 05:08:06 -0400 (0:00:00.231) 0:04:27.218 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026 05:08:06 -0400 (0:00:00.342) 0:04:27.560 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026 05:08:06 -0400 (0:00:00.229) 0:04:27.790 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026 05:08:06 -0400 (0:00:00.170) 0:04:27.960 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026 05:08:07 -0400 (0:00:00.239) 0:04:28.199 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026 05:08:07 -0400 (0:00:00.283) 0:04:28.482 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026 05:08:07 -0400 (0:00:00.205) 0:04:28.688 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 20 April 2026 05:08:07 -0400 (0:00:00.279) 0:04:28.967 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 20 April 2026 05:08:08 -0400 (0:00:00.193) 0:04:29.161 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 20 April 2026 05:08:08 -0400 (0:00:00.260) 0:04:29.422 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026 05:08:08 -0400 (0:00:00.243) 0:04:29.665 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026 05:08:08 -0400 (0:00:00.215) 0:04:29.880 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026 05:08:08 -0400 (0:00:00.246) 0:04:30.127 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 05:08:09 -0400 (0:00:00.233) 0:04:30.360 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 05:08:09 -0400 (0:00:00.131) 0:04:30.492 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 05:08:09 -0400 (0:00:00.197) 0:04:30.689 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 05:08:09 -0400 (0:00:00.288) 0:04:30.978 **********
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 05:08:10 -0400 (0:00:00.267) 0:04:31.245 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 05:08:10 -0400 (0:00:00.248) 0:04:31.493 **********
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 05:08:10 -0400 (0:00:00.190) 0:04:31.683 **********
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 05:08:10 -0400 (0:00:00.289) 0:04:31.973 **********
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 05:08:11 -0400 (0:00:00.277) 0:04:32.250 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 05:08:11 -0400 (0:00:00.245) 0:04:32.496 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 05:08:11 -0400 (0:00:00.262) 0:04:32.758 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 05:08:11 -0400 (0:00:00.286) 0:04:33.045 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 05:08:12 -0400 (0:00:00.250) 0:04:33.296 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 05:08:12 -0400 (0:00:00.255) 0:04:33.551 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 05:08:12 -0400 (0:00:00.244) 0:04:33.796 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 05:08:12 -0400 (0:00:00.231) 0:04:34.027 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April 2026 05:08:13 -0400 (0:00:00.285) 0:04:34.313 **********
skipping: [managed-node2] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 20 April 2026 05:08:13 -0400 (0:00:00.264) 0:04:34.577 **********
skipping: [managed-node2] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 05:08:13 -0400 (0:00:00.279) 0:04:34.856 **********
skipping: [managed-node2] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 05:08:13 -0400 (0:00:00.158) 0:04:35.015 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 05:08:14 -0400 (0:00:00.210) 0:04:35.226 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 05:08:14 -0400 (0:00:00.205) 0:04:35.431 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 20 April 2026 05:08:14 -0400 (0:00:00.234) 0:04:35.666 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 20 April 2026 05:08:14 -0400 (0:00:00.261) 0:04:35.928 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 20 April 2026 05:08:15 -0400 (0:00:00.219) 0:04:36.147 **********
ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 20 April 2026 05:08:15 -0400 (0:00:00.257) 0:04:36.404 **********
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 20 April 2026 05:08:15 -0400 (0:00:00.235) 0:04:36.640 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 20 April 2026 05:08:15 -0400 (0:00:00.209) 0:04:36.849 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 05:08:15 -0400 (0:00:00.155) 0:04:37.005 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 05:08:16 -0400 (0:00:00.180) 0:04:37.185 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 05:08:16 -0400 (0:00:00.194) 0:04:37.380 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 05:08:16 -0400 (0:00:00.171) 0:04:37.551 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 05:08:16 -0400 (0:00:00.203) 0:04:37.755 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 05:08:16 -0400 (0:00:00.265) 0:04:38.021 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 05:08:17 -0400 (0:00:00.205) 0:04:38.226 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 05:08:17 -0400 (0:00:00.167) 0:04:38.394 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Monday 20 April 2026 05:08:17 -0400 (0:00:00.162) 0:04:38.556 **********
changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode - 2] ******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:158
Monday 20 April 2026 05:08:19 -0400 (0:00:01.682) 0:04:40.239 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 20 April 2026 05:08:19 -0400 (0:00:00.459) 0:04:40.698 **********
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 20 April 2026 05:08:19 -0400 (0:00:00.207) 0:04:40.906 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 05:08:20 -0400 (0:00:00.276) 0:04:41.183 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 05:08:20 -0400 (0:00:00.206) 0:04:41.389 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 05:08:21 -0400 (0:00:00.788) 0:04:42.178 **********
ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Monday 20 April 2026 05:08:22 -0400 (0:00:01.710) 0:04:43.888 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 05:08:22 -0400 (0:00:00.194) 0:04:44.083 **********
ok: [managed-node2]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 05:08:24 -0400 (0:00:02.044) 0:04:46.128 **********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 20 April 2026 05:08:25 -0400 (0:00:00.514) 0:04:46.642 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 20 April 2026 05:08:25 -0400 (0:00:00.293) 0:04:46.936 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:08:25 -0400 (0:00:00.181) 0:04:47.117 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:08:26 -0400 (0:00:00.179) 0:04:47.297 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:08:26 -0400 (0:00:00.124) 0:04:47.422 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:08:26 -0400 (0:00:00.429) 0:04:47.851 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:08:26 -0400 (0:00:00.286) 0:04:48.137 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 05:08:27 -0400 (0:00:00.141) 0:04:48.279 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:08:31 -0400 (0:00:03.968) 0:04:52.248 ********** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:08:31 -0400 (0:00:00.295) 0:04:52.543 ********** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:08:31 -0400 (0:00:00.199) 0:04:52.743 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:08:36 -0400 (0:00:05.305) 0:04:58.048 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:08:37 -0400 (0:00:00.374) 0:04:58.423 ********** TASK 
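For reference, the `storage_volumes` value echoed by the "Show storage_volumes" task above corresponds to a play-level variable definition along the following lines. This is a hedged reconstruction from the logged values, not the actual test playbook source; the surrounding play structure (hosts, role invocation) is assumed, while the volume keys and values are taken verbatim from the log and match the storage role's documented interface:

```yaml
# Sketch of the input that would produce the "Show storage_volumes" output above.
# Volume fields are copied from the logged JSON; the play wrapper is an assumption.
- hosts: managed-node2
  roles:
    - fedora.linux_system_roles.storage
  vars:
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        encryption_password: yabbadabbadoo
```

With this input, the role's "Get required packages" step reports only `cryptsetup` as an additional package, consistent with the LUKS2 encryption request in the volume spec.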
[fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:08:37 -0400 (0:00:00.140) 0:04:58.563 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:08:37 -0400 (0:00:00.293) 0:04:58.856 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:08:37 -0400 (0:00:00.137) 0:04:58.994 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:08:42 -0400 (0:00:04.360) 0:05:03.355 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": 
"systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service": { "name": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service": { "name": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:08:45 -0400 (0:00:02.853) 0:05:06.208 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d00a7163b\x2d2630\x2d4914\x2d9eff\x2d8d6f78b6405b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "name": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "status": { "ActiveEnterTimestampMonotonic": "0", 
"ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-00a7163b-2630-4914-9eff-8d6f78b6405b", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", 
"EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-00a7163b-2630-4914-9eff-8d6f78b6405b /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-00a7163b-2630-4914-9eff-8d6f78b6405b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", 
"LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:07:42 EDT", "StateChangeTimestampMonotonic": "2000320275", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d2630\x2d4914\x2d9eff\x2d8d6f78b6405b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "name": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", 
"BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": 
"18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": 
"systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK 
[fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:08:48 -0400 (0:00:03.124) 0:05:09.333 ********** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:08:53 -0400 (0:00:05.611) 0:05:14.945 ********** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 
'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:08:53 -0400 (0:00:00.181) 0:05:15.127 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d00a7163b\x2d2630\x2d4914\x2d9eff\x2d8d6f78b6405b.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "name": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": 
"0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service is masked.\"", 
"LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d00a7163b\\x2d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": 
"dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d2630\x2d4914\x2d9eff\x2d8d6f78b6405b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "name": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice 
cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": 
"67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d2630\\x2d4914\\x2d9eff\\x2d8d6f78b6405b.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": 
"0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:08:57 -0400 (0:00:03.550) 0:05:18.677 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:08:57 -0400 (0:00:00.136) 0:05:18.814 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:08:57 -0400 (0:00:00.218) 0:05:19.033 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 05:08:57 -0400 (0:00:00.065) 0:05:19.098 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676098.8063953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776676098.8063953, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776676098.8063953, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "925193889", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 05:08:58 -0400 (0:00:00.991) 0:05:20.090 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:179 Monday 20 April 2026 05:08:59 -0400 
(0:00:00.252) 0:05:20.342 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:08:59 -0400 (0:00:00.372) 0:05:20.714 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:08:59 -0400 (0:00:00.188) 0:05:20.903 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:08:59 -0400 (0:00:00.156) 0:05:21.059 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:09:01 -0400 (0:00:01.361) 0:05:22.420 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:09:01 -0400 (0:00:00.199) 0:05:22.620 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set 
platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:09:03 -0400 (0:00:01.657) 0:05:24.278 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:09:03 -0400 (0:00:00.196) 0:05:24.475 ********** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:09:03 -0400 (0:00:00.180) 0:05:24.656 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:09:03 -0400 (0:00:00.099) 0:05:24.756 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:09:03 -0400 (0:00:00.037) 0:05:24.793 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:09:03 -0400 (0:00:00.091) 0:05:24.884 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:09:04 -0400 (0:00:00.291) 0:05:25.176 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:09:04 -0400 (0:00:00.104) 0:05:25.281 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:09:04 -0400 (0:00:00.103) 0:05:25.384 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:09:07 -0400 (0:00:03.731) 0:05:29.116 ********** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:09:08 -0400 (0:00:00.117) 0:05:29.233 ********** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:09:08 -0400 (0:00:00.074) 0:05:29.308 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK 
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:09:12 -0400 (0:00:04.409) 0:05:33.718 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:09:12 -0400 (0:00:00.129) 0:05:33.847 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:09:12 -0400 (0:00:00.077) 0:05:33.924 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:09:12 -0400 (0:00:00.052) 0:05:33.977 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:09:12 -0400 (0:00:00.066) 0:05:34.043 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:09:16 -0400 (0:00:03.511) 0:05:37.554 ********** ok: [managed-node2] 
=> { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": 
"dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { 
"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": 
"microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": 
{ "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", 
"state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:09:18 -0400 (0:00:02.206) 0:05:39.760 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:09:18 
-0400 (0:00:00.345) 0:05:40.106 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:09:32 -0400 (0:00:13.419) 0:05:53.525 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:09:32 -0400 (0:00:00.299) 0:05:53.825 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676060.608481, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1ec0d349c722cf5475306973d39f1251f464d37f", "ctime": 1776676060.604481, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776676060.604481, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:09:33 -0400 (0:00:01.256) 0:05:55.081 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:09:35 -0400 (0:00:01.091) 0:05:56.172 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:09:35 -0400 (0:00:00.235) 0:05:56.408 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": 
"/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:09:35 -0400 (0:00:00.208) 0:05:56.616 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:09:35 -0400 (0:00:00.245) 0:05:56.862 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:09:35 -0400 (0:00:00.202) 0:05:57.065 ********** changed: [managed-node2] => (item={'src': 'UUID=ca304cfd-7efc-4432-b90d-3d46627b677d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ca304cfd-7efc-4432-b90d-3d46627b677d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:09:37 -0400 (0:00:01.567) 0:05:58.632 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:09:38 -0400 (0:00:01.359) 0:05:59.992 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:09:40 -0400 (0:00:01.357) 0:06:01.349 ********** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": 
"/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:09:40 -0400 (0:00:00.100) 0:06:01.450 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:09:41 -0400 (0:00:01.606) 0:06:03.057 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676070.5014586, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776676064.8684714, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 517996677, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776676064.8674715, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "586199598", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:09:42 -0400 (0:00:01.058) 0:06:04.115 ********** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 
'luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:09:44 -0400 (0:00:01.278) 0:06:05.393 ********** ok: [managed-node2] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:192 Monday 20 April 2026 05:09:45 -0400 (0:00:01.609) 0:06:07.003 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:09:46 -0400 (0:00:00.551) 0:06:07.555 ********** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:09:46 -0400 (0:00:00.342) 0:06:07.897 ********** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:09:46 -0400 (0:00:00.242) 0:06:08.139 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "size": "10G", "type": "crypt", "uuid": "0ee979cf-41e2-4995-af71-3ee586342a0b" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "0b79776e-eb2f-4731-b6ba-8e4ca51529c4" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": 
"/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:09:48 -0400 (0:00:01.605) 0:06:09.744 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002443", "end": "2026-04-20 05:09:49.891326", "rc": 0, "start": "2026-04-20 05:09:49.888883" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:09:50 -0400 (0:00:01.451) 0:06:11.196 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002701", "end": "2026-04-20 05:09:51.028670", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:09:51.025969" } STDOUT: luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:09:51 -0400 (0:00:01.219) 0:06:12.416 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 05:09:51 -0400 (0:00:00.135) 0:06:12.551 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:09:51 -0400 (0:00:00.314) 0:06:12.865 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:09:51 -0400 (0:00:00.146) 0:06:13.011 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:09:53 -0400 (0:00:01.314) 0:06:14.326 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:09:53 -0400 (0:00:00.207) 0:06:14.533 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:09:53 -0400 (0:00:00.185) 0:06:14.718 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:09:53 -0400 (0:00:00.254) 0:06:14.973 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:09:54 -0400 (0:00:00.260) 0:06:15.233 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:09:54 -0400 (0:00:00.294) 0:06:15.528 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:09:54 -0400 (0:00:00.249) 0:06:15.777 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 05:09:54 -0400 (0:00:00.267) 0:06:16.045 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:09:55 -0400 (0:00:00.137) 0:06:16.182 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 
April 2026 05:09:55 -0400 (0:00:00.139) 0:06:16.321 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:09:55 -0400 (0:00:00.122) 0:06:16.444 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:09:55 -0400 (0:00:00.197) 0:06:16.641 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 05:09:55 -0400 (0:00:00.430) 0:06:17.071 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 
April 2026 05:09:56 -0400 (0:00:00.204) 0:06:17.276 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 05:09:56 -0400 (0:00:00.297) 0:06:17.573 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 05:09:56 -0400 (0:00:00.188) 0:06:17.762 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 05:09:56 -0400 (0:00:00.272) 0:06:18.035 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 05:09:57 -0400 (0:00:00.134) 0:06:18.170 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 05:09:57 -0400 (0:00:00.241) 0:06:18.411 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 05:09:57 -0400 (0:00:00.367) 0:06:18.778 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676171.8802311, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676171.8802311, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 37259, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776676171.8802311, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 05:09:59 -0400 (0:00:01.403) 0:06:20.182 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 05:09:59 -0400 (0:00:00.124) 0:06:20.307 ********** skipping: [managed-node2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 05:09:59 -0400 (0:00:00.223) 0:06:20.530 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 05:09:59 -0400 (0:00:00.223) 0:06:20.754 ********** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 05:09:59 -0400 (0:00:00.191) 0:06:20.946 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:09:59 -0400 (0:00:00.192) 0:06:21.139 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:10:00 -0400 (0:00:00.179) 0:06:21.319 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676172.0342307, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676172.0342307, "dev": 6, 
"device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 201433, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776676172.0342307, "nlink": 1, "path": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:10:01 -0400 (0:00:01.365) 0:06:22.684 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:10:05 -0400 (0:00:04.113) 0:06:26.798 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011224", "end": "2026-04-20 05:10:06.679429", "rc": 0, "start": "2026-04-20 05:10:06.668205" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 0b79776e-eb2f-4731-b6ba-8e4ca51529c4 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 937509 Threads: 2 Salt: 7e df 6c b4 15 1d bb f3 
d0 ff 51 c9 b3 6c 2b 60 96 c5 dd 2e 8e ba a6 8c 16 23 15 4f 1d 7c 7b de AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 7f 10 a4 70 d0 5f 37 55 d6 c1 b1 bc 86 f4 9b dd 65 2d 8f 2a 80 01 15 84 cf 5c 23 32 0c c3 18 f3 Digest: c0 0b 6f 07 79 a3 17 66 bb 7d 4f da 1a e4 3f 9e 9d 56 df 94 0e 7c 9b dc 9b e1 30 f1 9d 61 15 86 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:10:06 -0400 (0:00:01.257) 0:06:28.055 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:10:07 -0400 (0:00:00.190) 0:06:28.246 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:10:07 -0400 (0:00:00.221) 0:06:28.468 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 05:10:07 -0400 (0:00:00.216) 0:06:28.684 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:10:07 -0400 (0:00:00.307) 0:06:28.991 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:10:07 -0400 (0:00:00.126) 0:06:29.118 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:10:08 -0400 (0:00:00.149) 0:06:29.267 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:10:08 -0400 (0:00:00.163) 0:06:29.431 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 05:10:08 -0400 (0:00:00.107) 0:06:29.539 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 05:10:08 -0400 (0:00:00.300) 0:06:29.840 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 05:10:08 -0400 (0:00:00.250) 0:06:30.091 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 05:10:09 -0400 (0:00:00.277) 0:06:30.368 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 05:10:09 -0400 (0:00:00.247) 0:06:30.616 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 05:10:09 -0400 (0:00:00.161) 0:06:30.778 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 05:10:09 -0400 (0:00:00.274) 0:06:31.052 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 05:10:10 -0400 (0:00:00.304) 0:06:31.356 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 05:10:10 -0400 (0:00:00.214) 0:06:31.571 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:10:10 -0400 (0:00:00.188) 0:06:31.760 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:10:10 -0400 (0:00:00.274) 0:06:32.034 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:10:11 
-0400 (0:00:00.252) 0:06:32.286 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:10:11 -0400 (0:00:00.229) 0:06:32.516 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:10:11 -0400 (0:00:00.380) 0:06:32.896 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:10:11 -0400 (0:00:00.195) 0:06:33.091 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:10:12 -0400 (0:00:00.250) 0:06:33.342 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 05:10:12 -0400 (0:00:00.139) 0:06:33.481 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:10:12 -0400 (0:00:00.243) 0:06:33.725 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:10:12 -0400 (0:00:00.240) 0:06:33.965 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:10:13 -0400 (0:00:00.197) 0:06:34.163 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 05:10:13 -0400 (0:00:00.310) 0:06:34.474 ********** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 05:10:13 -0400 (0:00:00.269) 0:06:34.743 ********** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 
2026 05:10:13 -0400 (0:00:00.191) 0:06:34.935 ********** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 05:10:14 -0400 (0:00:00.238) 0:06:35.173 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 05:10:14 -0400 (0:00:00.192) 0:06:35.366 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 05:10:14 -0400 (0:00:00.191) 0:06:35.558 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 05:10:14 -0400 (0:00:00.271) 0:06:35.830 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 05:10:14 -0400 (0:00:00.173) 0:06:36.004 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max 
usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 05:10:15 -0400 (0:00:00.245) 0:06:36.249 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 05:10:15 -0400 (0:00:00.175) 0:06:36.424 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 05:10:15 -0400 (0:00:00.261) 0:06:36.686 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 05:10:15 -0400 (0:00:00.297) 0:06:36.984 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:10:16 -0400 (0:00:00.235) 0:06:37.219 ********** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 05:10:16 -0400 (0:00:00.282) 0:06:37.501 ********** skipping: 
[managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 05:10:16 -0400 (0:00:00.262) 0:06:37.764 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 05:10:16 -0400 (0:00:00.247) 0:06:38.011 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 05:10:17 -0400 (0:00:00.168) 0:06:38.180 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 05:10:17 -0400 (0:00:00.240) 0:06:38.420 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 05:10:17 -0400 (0:00:00.199) 0:06:38.620 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 05:10:17 -0400 (0:00:00.221) 0:06:38.841 ********** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 05:10:17 -0400 (0:00:00.263) 0:06:39.105 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 05:10:18 -0400 (0:00:00.240) 0:06:39.345 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 05:10:18 -0400 (0:00:00.257) 0:06:39.603 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 05:10:18 -0400 (0:00:00.203) 0:06:39.806 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 05:10:18 -0400 (0:00:00.168) 0:06:39.975 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 05:10:19 -0400 (0:00:00.173) 0:06:40.148 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 05:10:19 -0400 (0:00:00.240) 0:06:40.388 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 05:10:19 -0400 (0:00:00.259) 0:06:40.648 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 05:10:19 -0400 (0:00:00.184) 0:06:40.833 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 
2026 05:10:19 -0400 (0:00:00.080) 0:06:40.913 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 05:10:19 -0400 (0:00:00.098) 0:06:41.012 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:199 Monday 20 April 2026 05:10:19 -0400 (0:00:00.104) 0:06:41.116 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 05:10:20 -0400 (0:00:00.323) 0:06:41.440 ********** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 05:10:20 -0400 (0:00:00.139) 0:06:41.579 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:10:20 -0400 (0:00:00.174) 0:06:41.754 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:10:20 -0400 (0:00:00.115) 0:06:41.869 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:10:20 -0400 (0:00:00.189) 0:06:42.059 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:10:22 -0400 (0:00:01.431) 0:06:43.491 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:10:22 -0400 (0:00:00.194) 0:06:43.686 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:10:24 -0400 (0:00:01.903) 0:06:45.589 ********** skipping: [managed-node2] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:10:24 -0400 (0:00:00.345) 0:06:45.935 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 
20 April 2026 05:10:24 -0400 (0:00:00.182) 0:06:46.117 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:10:25 -0400 (0:00:00.211) 0:06:46.329 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:10:25 -0400 (0:00:00.258) 0:06:46.588 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:10:25 -0400 (0:00:00.131) 0:06:46.719 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:10:26 -0400 (0:00:00.479) 0:06:47.198 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:10:26 -0400 (0:00:00.158) 0:06:47.357 ********** skipping: [managed-node2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:10:26 -0400 (0:00:00.213) 0:06:47.570 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:10:30 -0400 (0:00:03.911) 0:06:51.482 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:10:30 -0400 (0:00:00.225) 0:06:51.708 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:10:30 -0400 (0:00:00.289) 0:06:51.997 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:10:36 -0400 (0:00:05.498) 0:06:57.496 ********** included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:10:36 -0400 (0:00:00.405) 0:06:57.902 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:10:36 -0400 (0:00:00.183) 0:06:58.085 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:10:37 -0400 (0:00:00.276) 0:06:58.362 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:10:37 -0400 (0:00:00.201) 0:06:58.563 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:10:41 -0400 (0:00:04.528) 0:07:03.092 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": 
"dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": 
"gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", 
"status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, 
"systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:10:44 -0400 (0:00:02.420) 0:07:05.513 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:10:44 -0400 (0:00:00.266) 0:07:05.780 ********** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:10:50 -0400 (0:00:05.401) 0:07:11.182 ********** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': 
True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:10:50 -0400 (0:00:00.278) 0:07:11.460 ********** TASK [Check that we failed in the role] **************************************** task path: 
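
[editor's note] The failure above is expected: the subsequent "Check that we failed in the role" task asserts that the role fails when a volume has `encryption: true` but no key or password. In real-world use, the `module_args` shown above indicate the fix: supply `encryption_password` (or `encryption_key`) for the encrypted volume. A minimal sketch, assuming a vaulted variable named `vault_luks_password` (hypothetical name, not from this log):

```yaml
# Sketch only: this test run intentionally omits the password to verify
# the role fails. For actual provisioning, set encryption_password on
# any volume declared with encryption: true.
- hosts: managed-node2
  vars:
    storage_pools:
      - name: foo
        type: partition
        disks: ['sda']
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            # vault_luks_password is an assumed Ansible Vault variable
            encryption_password: "{{ vault_luks_password }}"
  roles:
    - fedora.linux_system_roles.storage
```

Keeping the password in Ansible Vault (rather than inline) avoids leaking it into logs such as this one.
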
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:10:50 -0400 (0:00:00.376) 0:07:11.836 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:10:50 -0400 (0:00:00.216) 0:07:12.053 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:10:51 -0400 (0:00:00.310) 0:07:12.363 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:219 Monday 20 April 2026 05:10:51 -0400 (0:00:00.154) 0:07:12.518 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:10:51 -0400 (0:00:00.560) 0:07:13.078 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 
April 2026 05:10:52 -0400 (0:00:00.196) 0:07:13.275 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:10:52 -0400 (0:00:00.190) 0:07:13.466 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:10:54 -0400 (0:00:01.927) 0:07:15.393 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:10:54 -0400 (0:00:00.269) 0:07:15.663 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:10:56 -0400 (0:00:01.816) 0:07:17.480 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", 
"stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:10:56 -0400 (0:00:00.434) 0:07:17.915 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:10:57 -0400 (0:00:00.239) 0:07:18.154 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:10:57 -0400 (0:00:00.273) 0:07:18.428 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:10:57 -0400 (0:00:00.170) 0:07:18.599 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:10:57 -0400 (0:00:00.138) 0:07:18.737 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:10:58 -0400 (0:00:00.440) 0:07:19.177 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:10:58 -0400 (0:00:00.199) 0:07:19.377 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:10:58 -0400 (0:00:00.233) 0:07:19.610 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:11:02 -0400 (0:00:03.787) 0:07:23.398 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:11:02 -0400 (0:00:00.202) 0:07:23.600 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:11:02 -0400 (0:00:00.169) 0:07:23.770 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:11:07 -0400 (0:00:05.215) 0:07:28.985 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:11:08 -0400 (0:00:00.247) 0:07:29.232 ********** TASK [fedora.linux_system_roles.storage : Make sure 
COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:11:08 -0400 (0:00:00.187) 0:07:29.420 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:11:08 -0400 (0:00:00.166) 0:07:29.586 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:11:08 -0400 (0:00:00.186) 0:07:29.773 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:11:12 -0400 (0:00:04.275) 0:07:34.048 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": 
"nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": 
"unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { 
"name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": 
"systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": 
{ "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:11:15 -0400 (0:00:02.427) 0:07:36.476 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:11:15 -0400 (0:00:00.272) 0:07:36.749 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": 
"/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:11:29 -0400 (0:00:13.940) 0:07:50.689 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:11:29 -0400 (0:00:00.251) 0:07:50.941 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676180.009213, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "47f722a816266f4247a5d78fbdda6d555ac00277", "ctime": 1776676180.006213, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"text/plain", "mode": "0644", "mtime": 1776676180.006213, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:11:31 -0400 (0:00:01.931) 0:07:52.873 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:11:33 -0400 (0:00:01.647) 0:07:54.520 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:11:33 -0400 (0:00:00.117) 0:07:54.638 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:11:33 -0400 (0:00:00.143) 0:07:54.782 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:11:33 -0400 (0:00:00.292) 0:07:55.075 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:11:34 -0400 (0:00:00.159) 0:07:55.234 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4" } 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:11:35 -0400 (0:00:01.323) 0:07:56.558 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:11:37 -0400 (0:00:01.668) 0:07:58.226 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:11:38 -0400 (0:00:01.538) 0:07:59.765 ********** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": 
"defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:11:38 -0400 (0:00:00.203) 0:07:59.968 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:11:40 -0400 (0:00:01.783) 0:08:01.752 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676191.0281882, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "dac4d61477bb392316072beb7ddd386d672e1067", "ctime": 1776676184.052204, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 136315086, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776676184.0502038, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "981726734", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:11:42 -0400 (0:00:01.403) 0:08:03.155 ********** 
changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-082dbd3e-5ddd-4519-a194-5805ce58d144', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:11:44 -0400 (0:00:02.616) 0:08:05.772 ********** ok: [managed-node2] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:236 Monday 20 April 2026 05:11:46 -0400 (0:00:01.590) 0:08:07.362 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:11:47 -0400 (0:00:01.271) 0:08:08.633 ********** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:11:47 -0400 (0:00:00.259) 0:08:08.893 ********** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:11:48 -0400 (0:00:00.273) 0:08:09.166 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "size": "4G", "type": "crypt", "uuid": "d4917ecc-7a0b-43b0-a595-335d624f1070" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "082dbd3e-5ddd-4519-a194-5805ce58d144" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", 
"label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:11:49 -0400 (0:00:01.339) 0:08:10.506 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.003707", "end": "2026-04-20 05:11:51.329516", "rc": 0, "start": "2026-04-20 05:11:50.325809" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144 /opt/test1 xfs defaults 
0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:11:51 -0400 (0:00:02.155) 0:08:12.661 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002409", "end": "2026-04-20 05:11:52.532888", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:11:52.530479" } STDOUT: luks-082dbd3e-5ddd-4519-a194-5805ce58d144 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:11:52 -0400 (0:00:01.215) 0:08:13.876 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 05:11:52 -0400 (0:00:00.245) 0:08:14.122 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 05:11:53 -0400 (0:00:00.087) 0:08:14.210 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 05:11:53 -0400 (0:00:00.202) 0:08:14.412 ********** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 05:11:53 -0400 (0:00:00.159) 0:08:14.572 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 05:11:53 -0400 (0:00:00.506) 0:08:15.079 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 05:11:54 -0400 (0:00:00.231) 0:08:15.311 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 05:11:54 -0400 (0:00:00.252) 0:08:15.563 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 05:11:54 -0400 
(0:00:00.185) 0:08:15.748 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 05:11:54 -0400 (0:00:00.194) 0:08:15.943 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 05:11:55 -0400 (0:00:00.213) 0:08:16.156 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 05:11:55 -0400 (0:00:00.245) 0:08:16.402 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 05:11:55 -0400 (0:00:00.273) 0:08:16.676 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 05:11:55 -0400 (0:00:00.231) 0:08:16.908 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 05:11:55 -0400 (0:00:00.174) 0:08:17.082 ********** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.110 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 05:11:57 -0400 (0:00:01.294) 0:08:18.376 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 05:11:57 -0400 (0:00:00.105) 0:08:18.481 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 05:11:57 -0400 (0:00:00.314) 0:08:18.796 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 05:11:57 -0400 (0:00:00.166) 0:08:18.962 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 05:11:57 -0400 (0:00:00.164) 0:08:19.127 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 05:11:58 -0400 (0:00:00.181) 0:08:19.309 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 05:11:58 -0400 (0:00:00.111) 0:08:19.420 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 05:11:58 -0400 (0:00:00.108) 0:08:19.528 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 05:11:58 -0400 (0:00:00.091) 0:08:19.619 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 05:11:58 -0400 (0:00:00.162) 0:08:19.782 
********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 05:11:58 -0400 (0:00:00.205) 0:08:19.987 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 05:11:59 -0400 (0:00:00.154) 0:08:20.142 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 05:11:59 -0400 (0:00:00.135) 0:08:20.278 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 05:11:59 -0400 (0:00:00.145) 0:08:20.423 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 05:11:59 
-0400 (0:00:00.347) 0:08:20.771 ********** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 05:11:59 -0400 (0:00:00.229) 0:08:21.001 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 05:12:00 -0400 (0:00:00.398) 0:08:21.399 ********** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 
'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 05:12:00 -0400 (0:00:00.269) 0:08:21.669 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 05:12:00 -0400 (0:00:00.367) 0:08:22.036 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 05:12:01 -0400 (0:00:00.250) 0:08:22.287 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 05:12:01 -0400 (0:00:00.167) 0:08:22.455 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 05:12:01 -0400 (0:00:00.315) 0:08:22.770 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] 
*************************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 05:12:01 -0400 (0:00:00.215) 0:08:22.985 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 05:12:02 -0400 (0:00:00.383) 0:08:23.369 ********** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": 
"Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 05:12:02 -0400 (0:00:00.294) 0:08:23.663 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 05:12:03 -0400 (0:00:00.506) 0:08:24.170 ********** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 05:12:03 -0400 (0:00:00.253) 0:08:24.423 ********** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 05:12:03 -0400 (0:00:00.251) 0:08:24.675 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 05:12:03 -0400 (0:00:00.100) 0:08:24.776 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 05:12:03 -0400 (0:00:00.188) 0:08:24.965 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 05:12:03 -0400 (0:00:00.153) 0:08:25.118 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 05:12:04 -0400 (0:00:00.197) 0:08:25.316 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 05:12:04 -0400 (0:00:00.171) 0:08:25.487 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 05:12:04 -0400 (0:00:00.137) 0:08:25.625 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:12:04 -0400 (0:00:00.363) 0:08:25.989 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:12:05 -0400 (0:00:00.281) 0:08:26.270 ********** included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:12:06 -0400 (0:00:01.063) 0:08:27.334 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:12:06 -0400 (0:00:00.200) 0:08:27.535 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the 
mountpoint directory] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:12:06 -0400 (0:00:00.180) 0:08:27.715 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:12:07 -0400 (0:00:00.750) 0:08:28.466 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:12:07 -0400 (0:00:00.291) 0:08:28.758 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:12:07 -0400 (0:00:00.094) 0:08:28.853 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:12:07 -0400 (0:00:00.145) 0:08:28.998 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 05:12:08 -0400 (0:00:00.191) 0:08:29.190 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:12:08 -0400 (0:00:00.261) 0:08:29.452 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 05:12:08 -0400 (0:00:00.135) 0:08:29.587 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:12:08 -0400 (0:00:00.126) 0:08:29.714 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:12:08 -0400 (0:00:00.228) 0:08:29.942 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 05:12:09 -0400 (0:00:00.365) 0:08:30.307 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 05:12:09 -0400 (0:00:00.119) 0:08:30.427 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 05:12:09 -0400 (0:00:00.176) 0:08:30.604 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 05:12:09 -0400 (0:00:00.222) 0:08:30.826 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 05:12:10 -0400 (0:00:00.360) 0:08:31.186 
********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 05:12:10 -0400 (0:00:00.157) 0:08:31.344 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 05:12:10 -0400 (0:00:00.171) 0:08:31.515 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 05:12:10 -0400 (0:00:00.251) 0:08:31.767 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676289.0209682, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676289.0209682, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217220, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776676289.0209682, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, 
"size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 05:12:12 -0400 (0:00:01.461) 0:08:33.229 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 05:12:12 -0400 (0:00:00.229) 0:08:33.458 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 05:12:12 -0400 (0:00:00.292) 0:08:33.750 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 05:12:12 -0400 (0:00:00.283) 0:08:34.034 ********** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 05:12:13 -0400 (0:00:00.263) 0:08:34.297 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's 
device type] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:12:13 -0400 (0:00:00.263) 0:08:34.561 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:12:13 -0400 (0:00:00.240) 0:08:34.802 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676289.175968, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676289.175968, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 216851, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776676289.175968, "nlink": 1, "path": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:12:15 -0400 (0:00:01.621) 0:08:36.424 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:12:19 -0400 (0:00:04.295) 0:08:40.719 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010071", "end": "2026-04-20 05:12:20.822255", "rc": 0, "start": "2026-04-20 05:12:20.812184" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 082dbd3e-5ddd-4519-a194-5805ce58d144 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 917562 Threads: 2 Salt: a4 08 eb 96 d6 5c ee f9 35 e6 0d c8 7a 84 dc 4c e1 c1 45 eb f6 7b be b3 32 0b 7a fe 65 76 d6 bb AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 6d 9d 3a a1 0d 3b db cc e2 33 e9 a3 04 6d 4b 9a 4f 50 c2 e8 55 e6 24 24 1c c0 7a d6 28 2a 65 0e Digest: 33 ae c3 5b c3 44 20 2a 87 32 d7 ba aa 85 8f ca 40 d0 82 50 71 91 91 6b 66 f4 fc 3c 4c 65 89 ee TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:12:21 -0400 (0:00:01.539) 0:08:42.259 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:12:21 -0400 (0:00:00.256) 0:08:42.515 ********** ok: 
[managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:12:21 -0400 (0:00:00.265) 0:08:42.781 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 05:12:21 -0400 (0:00:00.271) 0:08:43.052 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:12:22 -0400 (0:00:00.276) 0:08:43.328 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:12:22 -0400 (0:00:00.368) 0:08:43.696 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:12:22 -0400 (0:00:00.332) 0:08:44.029 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:12:23 -0400 (0:00:00.263) 0:08:44.292 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-082dbd3e-5ddd-4519-a194-5805ce58d144 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 05:12:23 -0400 (0:00:00.333) 0:08:44.626 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 05:12:23 -0400 (0:00:00.290) 0:08:44.916 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 05:12:24 -0400 (0:00:00.261) 0:08:45.178 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 05:12:24 -0400 (0:00:00.359) 0:08:45.537 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 05:12:24 -0400 (0:00:00.354) 0:08:45.892 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 05:12:24 -0400 (0:00:00.188) 0:08:46.080 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 05:12:25 -0400 (0:00:00.232) 0:08:46.313 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 05:12:25 -0400 (0:00:00.296) 0:08:46.609 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 05:12:25 -0400 (0:00:00.292) 0:08:46.902 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:12:25 -0400 (0:00:00.227) 0:08:47.130 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:12:26 -0400 (0:00:00.277) 0:08:47.407 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:12:26 -0400 (0:00:00.281) 0:08:47.689 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:12:26 -0400 (0:00:00.308) 0:08:47.998 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:12:27 -0400 (0:00:00.233) 0:08:48.232 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:12:27 
-0400 (0:00:00.160) 0:08:48.393 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:12:27 -0400 (0:00:00.251) 0:08:48.644 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 05:12:27 -0400 (0:00:00.255) 0:08:48.900 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:12:27 -0400 (0:00:00.225) 0:08:49.126 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:12:28 -0400 (0:00:00.263) 0:08:49.389 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:12:28 -0400 (0:00:00.235) 0:08:49.625 ********** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 05:12:28 -0400 (0:00:00.180) 0:08:49.805 ********** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 05:12:28 -0400 (0:00:00.161) 0:08:49.967 ********** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 05:12:29 -0400 (0:00:00.256) 0:08:50.224 ********** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 05:12:29 -0400 (0:00:00.325) 0:08:50.549 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 05:12:29 -0400 (0:00:00.270) 0:08:50.820 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 05:12:29 -0400 (0:00:00.290) 0:08:51.110 ********** 
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 05:12:30 -0400 (0:00:00.327) 0:08:51.438 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 05:12:30 -0400 (0:00:00.379) 0:08:51.817 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 05:12:31 -0400 (0:00:00.339) 0:08:52.156 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 05:12:31 -0400 (0:00:00.285) 0:08:52.442 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 05:12:31 -0400 (0:00:00.246) 0:08:52.688 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] 
************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 05:12:31 -0400 (0:00:00.248) 0:08:52.937 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:12:32 -0400 (0:00:00.349) 0:08:53.287 ********** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 05:12:32 -0400 (0:00:00.218) 0:08:53.505 ********** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 05:12:32 -0400 (0:00:00.195) 0:08:53.700 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 05:12:32 -0400 (0:00:00.197) 0:08:53.898 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 05:12:32 -0400 (0:00:00.209) 0:08:54.107 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 05:12:33 -0400 (0:00:00.231) 0:08:54.338 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 05:12:33 -0400 (0:00:00.279) 0:08:54.618 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 05:12:33 -0400 (0:00:00.348) 0:08:54.967 ********** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 05:12:34 -0400 (0:00:00.270) 0:08:55.238 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 05:12:34 -0400 (0:00:00.365) 0:08:55.603 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get 
information about the LV] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 05:12:34 -0400 (0:00:00.280) 0:08:55.883 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 05:12:35 -0400 (0:00:00.339) 0:08:56.222 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 05:12:35 -0400 (0:00:00.224) 0:08:56.447 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 05:12:35 -0400 (0:00:00.369) 0:08:56.816 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 05:12:35 -0400 (0:00:00.291) 0:08:57.108 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 05:12:36 -0400 (0:00:00.261) 0:08:57.370 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 05:12:36 -0400 (0:00:00.270) 0:08:57.641 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 05:12:36 -0400 (0:00:00.299) 0:08:57.940 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 05:12:37 -0400 (0:00:00.200) 0:08:58.141 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 05:12:37 -0400 (0:00:00.226) 0:08:58.367 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 05:12:37 -0400 (0:00:00.191) 0:08:58.559 
********** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:242 Monday 20 April 2026 05:12:39 -0400 (0:00:01.605) 0:09:00.164 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 05:12:39 -0400 (0:00:00.633) 0:09:00.798 ********** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 05:12:39 -0400 (0:00:00.193) 0:09:00.992 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:12:40 -0400 (0:00:00.299) 0:09:01.291 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:12:40 -0400 (0:00:00.203) 0:09:01.495 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:12:40 -0400 (0:00:00.247) 0:09:01.742 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:12:42 -0400 (0:00:01.444) 0:09:03.187 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:12:42 -0400 (0:00:00.245) 0:09:03.432 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:12:44 -0400 (0:00:02.191) 0:09:05.624 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:12:44 -0400 (0:00:00.508) 0:09:06.132 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:12:45 -0400 (0:00:00.207) 0:09:06.340 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 
05:12:45 -0400 (0:00:00.264) 0:09:06.605 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:12:45 -0400 (0:00:00.230) 0:09:06.835 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:12:45 -0400 (0:00:00.223) 0:09:07.059 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:12:46 -0400 (0:00:00.639) 0:09:07.699 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:12:46 -0400 (0:00:00.236) 0:09:07.935 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:12:47 -0400 (0:00:00.256) 0:09:08.192 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } 
MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:12:51 -0400 (0:00:04.309) 0:09:12.501 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:12:51 -0400 (0:00:00.176) 0:09:12.678 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:12:51 -0400 (0:00:00.190) 0:09:12.869 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:12:57 -0400 (0:00:05.556) 0:09:18.425 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:12:57 
-0400 (0:00:00.256) 0:09:18.682 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:12:57 -0400 (0:00:00.192) 0:09:18.874 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:12:57 -0400 (0:00:00.194) 0:09:19.069 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:12:58 -0400 (0:00:00.163) 0:09:19.232 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:13:02 -0400 (0:00:03.975) 0:09:23.207 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": 
"initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": 
"plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": 
"stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service": { "name": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service": { "name": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:13:05 -0400 (0:00:03.185) 0:09:26.393 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d0b79776e\x2deb2f\x2d4731\x2db6ba\x2d8e4ca51529c4.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "name": 
"systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4", "DevicePolicy": 
"auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0b79776e-eb2f-4731-b6ba-8e4ca51529c4 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": 
"infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:11:40 EDT", "StateChangeTimestampMonotonic": "2238378565", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...deb2f\x2d4731\x2db6ba\x2d8e4ca51529c4.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "name": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", 
"AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", 
"NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": 
"init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:13:08 -0400 (0:00:03.478) 0:09:29.871 ********** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-082dbd3e-5ddd-4519-a194-5805ce58d144' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:13:14 -0400 (0:00:05.326) 0:09:35.197 ********** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-082dbd3e-5ddd-4519-a194-5805ce58d144' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 
'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:13:14 -0400 (0:00:00.236) 0:09:35.434 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d0b79776e\x2deb2f\x2d4731\x2db6ba\x2d8e4ca51529c4.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "name": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice 
cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": 
"67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d0b79776e\\x2deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": 
"success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...deb2f\x2d4731\x2db6ba\x2d8e4ca51529c4.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "name": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...deb2f\\x2d4731\\x2db6ba\\x2d8e4ca51529c4.service", 
"NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:13:17 -0400 (0:00:03.075) 0:09:38.510 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:13:17 -0400 (0:00:00.253) 0:09:38.763 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:13:17 -0400 (0:00:00.235) 0:09:38.999 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 05:13:18 -0400 (0:00:00.160) 0:09:39.160 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676358.7598119, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776676358.7598119, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776676358.7598119, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": 
"3916302527", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 05:13:19 -0400 (0:00:01.279) 0:09:40.439 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:267 Monday 20 April 2026 05:13:19 -0400 (0:00:00.285) 0:09:40.725 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:13:20 -0400 (0:00:00.425) 0:09:41.151 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:13:20 -0400 (0:00:00.185) 0:09:41.336 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:13:20 -0400 (0:00:00.184) 0:09:41.521 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:13:21 -0400 (0:00:01.084) 0:09:42.605 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:13:21 -0400 (0:00:00.122) 0:09:42.728 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:13:22 -0400 (0:00:01.263) 0:09:43.992 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", 
"libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:13:23 -0400 (0:00:00.424) 0:09:44.416 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:13:23 -0400 (0:00:00.134) 0:09:44.550 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:13:23 -0400 (0:00:00.132) 0:09:44.683 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:13:23 -0400 (0:00:00.081) 0:09:44.765 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider 
tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:13:23 -0400 (0:00:00.052) 0:09:44.817 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:13:23 -0400 (0:00:00.154) 0:09:44.972 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:13:24 -0400 (0:00:00.212) 0:09:45.185 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:13:24 -0400 (0:00:00.078) 0:09:45.263 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:13:27 -0400 (0:00:03.546) 0:09:48.810 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] 
} TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:13:27 -0400 (0:00:00.209) 0:09:49.020 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:13:28 -0400 (0:00:00.210) 0:09:49.230 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:13:33 -0400 (0:00:05.356) 0:09:54.587 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:13:33 -0400 (0:00:00.270) 0:09:54.858 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:13:33 -0400 (0:00:00.127) 0:09:54.986 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:13:34 -0400 (0:00:00.188) 0:09:55.175 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:13:34 -0400 (0:00:00.097) 0:09:55.272 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:13:38 -0400 (0:00:04.238) 0:09:59.511 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { 
"name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": 
"iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": 
"mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { 
"name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service": { "name": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service": { "name": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": 
"unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:13:41 -0400 (0:00:02.761) 0:10:02.272 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d082dbd3e\x2d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", 
"CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-082dbd3e-5ddd-4519-a194-5805ce58d144 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-082dbd3e-5ddd-4519-a194-5805ce58d144 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:13:08 EDT", "StateChangeTimestampMonotonic": "2326489448", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", 
"CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", 
"KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", 
"ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:13:44 -0400 (0:00:03.429) 0:10:05.701 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", 
"fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:13:50 -0400 (0:00:05.698) 0:10:11.400 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:13:50 -0400 (0:00:00.274) 0:10:11.675 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676298.413947, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "efcfb69b0c20da9a9c3196b904f6f2c595034629", "ctime": 1776676298.4109473, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776676298.4109473, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, 
"rusr": true, "size": 1425, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:13:51 -0400 (0:00:01.220) 0:10:12.895 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:13:52 -0400 (0:00:01.139) 0:10:14.035 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d082dbd3e\x2d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner 
cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": 
"no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:13:08 EDT", "StateChangeTimestampMonotonic": "2326489448", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": 
"systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", 
"FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": 
"0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", 
"TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:13:56 -0400 (0:00:03.199) 0:10:17.234 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:13:56 -0400 (0:00:00.316) 0:10:17.550 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:13:56 -0400 (0:00:00.285) 0:10:17.836 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:13:56 -0400 (0:00:00.288) 0:10:18.125 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-082dbd3e-5ddd-4519-a194-5805ce58d144" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:13:58 -0400 (0:00:01.635) 0:10:19.760 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:14:00 -0400 (0:00:01.786) 0:10:21.547 ********** changed: [managed-node2] => (item={'src': 'UUID=236651e8-816f-4637-b539-a3edb65ca73c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"UUID=236651e8-816f-4637-b539-a3edb65ca73c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:14:01 -0400 (0:00:01.251) 0:10:22.798 ********** skipping: [managed-node2] => (item={'src': 'UUID=236651e8-816f-4637-b539-a3edb65ca73c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:14:02 -0400 (0:00:00.368) 0:10:23.166 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:14:03 -0400 (0:00:01.825) 0:10:24.992 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676312.5319157, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "95b75f64fb49f79d6f0397c184e71f6c866befbf", "ctime": 1776676304.4809337, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 268435665, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": 
true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776676304.4799335, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "3874953118", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:14:05 -0400 (0:00:01.259) 0:10:26.251 ********** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-082dbd3e-5ddd-4519-a194-5805ce58d144', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:14:06 -0400 (0:00:01.145) 0:10:27.397 ********** ok: [managed-node2] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:284 Monday 20 April 2026 05:14:07 -0400 (0:00:01.618) 0:10:29.016 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:14:08 -0400 
(0:00:00.534) 0:10:29.551 ********** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:14:08 -0400 
(0:00:00.214) 0:10:29.765 ********** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:14:08 -0400 (0:00:00.208) 0:10:29.974 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "236651e8-816f-4637-b539-a3edb65ca73c" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK 
[Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:14:10 -0400 (0:00:01.317) 0:10:31.291 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002444", "end": "2026-04-20 05:14:11.288571", "rc": 0, "start": "2026-04-20 05:14:11.286127" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=236651e8-816f-4637-b539-a3edb65ca73c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:14:11 -0400 (0:00:01.365) 0:10:32.657 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002531", "end": "2026-04-20 05:14:12.750569", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:14:12.748038" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:14:13 -0400 (0:00:01.489) 0:10:34.146 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 05:14:13 -0400 (0:00:00.358) 0:10:34.505 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 05:14:13 -0400 (0:00:00.142) 0:10:34.647 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 05:14:13 -0400 (0:00:00.194) 0:10:34.842 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] 
****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 05:14:14 -0400 (0:00:00.326) 0:10:35.168 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 05:14:14 -0400 (0:00:00.558) 0:10:35.727 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 05:14:14 -0400 (0:00:00.394) 0:10:36.121 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 05:14:15 -0400 (0:00:00.180) 0:10:36.301 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 05:14:15 -0400 (0:00:00.146) 0:10:36.447 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] 
********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 05:14:15 -0400 (0:00:00.129) 0:10:36.576 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 05:14:15 -0400 (0:00:00.258) 0:10:36.835 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 05:14:15 -0400 (0:00:00.251) 0:10:37.086 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 05:14:16 -0400 (0:00:00.149) 0:10:37.236 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 05:14:16 -0400 (0:00:00.191) 0:10:37.428 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 05:14:16 -0400 
(0:00:00.197) 0:10:37.626 ********** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.110 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 05:14:18 -0400 (0:00:01.545) 0:10:39.171 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 05:14:18 -0400 (0:00:00.206) 0:10:39.378 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 05:14:18 -0400 (0:00:00.627) 0:10:40.005 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 05:14:19 -0400 (0:00:00.289) 0:10:40.294 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 05:14:19 -0400 (0:00:00.209) 0:10:40.504 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version 
regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 05:14:19 -0400 (0:00:00.250) 0:10:40.754 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 05:14:19 -0400 (0:00:00.215) 0:10:40.970 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 05:14:20 -0400 (0:00:00.270) 0:10:41.240 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 05:14:20 -0400 (0:00:00.190) 0:10:41.431 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 05:14:20 -0400 (0:00:00.214) 0:10:41.645 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 
Monday 20 April 2026 05:14:20 -0400 (0:00:00.196) 0:10:41.841 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 05:14:20 -0400 (0:00:00.266) 0:10:42.108 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 05:14:21 -0400 (0:00:00.231) 0:10:42.339 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 05:14:21 -0400 (0:00:00.147) 0:10:42.487 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 05:14:21 -0400 (0:00:00.442) 0:10:42.930 ********** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 
'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=236651e8-816f-4637-b539-a3edb65ca73c', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 05:14:22 -0400 (0:00:00.233) 0:10:43.163 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 05:14:22 -0400 (0:00:00.511) 0:10:43.675 ********** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 
'UUID=236651e8-816f-4637-b539-a3edb65ca73c', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 05:14:22 -0400 (0:00:00.333) 0:10:44.018 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 05:14:23 -0400 (0:00:00.234) 0:10:44.252 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 05:14:23 -0400 (0:00:00.148) 0:10:44.401 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 05:14:23 -0400 (0:00:00.141) 0:10:44.543 ********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 05:14:23 -0400 (0:00:00.162) 0:10:44.705 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 05:14:23 -0400 (0:00:00.263) 0:10:44.969 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 05:14:24 
-0400 (0:00:00.356) 0:10:45.325 ********** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=236651e8-816f-4637-b539-a3edb65ca73c', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 05:14:24 -0400 (0:00:00.257) 0:10:45.583 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 05:14:24 -0400 (0:00:00.440) 0:10:46.024 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 05:14:25 -0400 (0:00:00.228) 0:10:46.252 ********** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 05:14:25 -0400 (0:00:00.218) 0:10:46.471 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that 
the pool was created] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 05:14:25 -0400 (0:00:00.239) 0:10:46.710 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 05:14:25 -0400 (0:00:00.232) 0:10:46.943 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 05:14:26 -0400 (0:00:00.283) 0:10:47.227 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 05:14:26 -0400 (0:00:00.356) 0:10:47.583 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 05:14:26 -0400 (0:00:00.159) 0:10:47.743 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the
volumes] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 05:14:26 -0400 (0:00:00.193) 0:10:47.937 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:14:27 -0400 (0:00:00.306) 0:10:48.244 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:14:27 -0400 (0:00:00.260) 0:10:48.504 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:14:28 -0400 (0:00:01.043) 0:10:49.547 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:14:28 -0400 (0:00:00.451) 0:10:49.999 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:14:29 -0400 (0:00:00.201) 0:10:50.201 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:14:29 -0400 (0:00:00.245) 0:10:50.446 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:14:29 -0400 (0:00:00.169) 0:10:50.616 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:14:29 -0400 (0:00:00.154) 0:10:50.771 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:14:29 -0400 (0:00:00.213) 0:10:50.984 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 05:14:30 -0400 (0:00:00.250) 0:10:51.234 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:14:30 -0400 (0:00:00.174) 0:10:51.409 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 
April 2026 05:14:30 -0400 (0:00:00.232) 0:10:51.641 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:14:30 -0400 (0:00:00.215) 0:10:51.857 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:14:30 -0400 (0:00:00.167) 0:10:52.025 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=236651e8-816f-4637-b539-a3edb65ca73c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 05:14:31 -0400 (0:00:00.506) 0:10:52.531 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 
05:14:31 -0400 (0:00:00.272) 0:10:52.804 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 05:14:31 -0400 (0:00:00.258) 0:10:53.062 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 05:14:32 -0400 (0:00:00.301) 0:10:53.364 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 05:14:32 -0400 (0:00:00.332) 0:10:53.696 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 05:14:32 -0400 (0:00:00.184) 0:10:53.881 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 05:14:33 -0400 (0:00:00.432) 0:10:54.313 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 05:14:33 -0400 (0:00:00.370) 0:10:54.684 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676429.9756522, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676429.9756522, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217220, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776676429.9756522, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 05:14:34 -0400 (0:00:01.276) 0:10:55.960 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 05:14:35 -0400 (0:00:00.194) 0:10:56.155 ********** skipping: [managed-node2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 05:14:35 -0400 (0:00:00.368) 0:10:56.523 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 05:14:35 -0400 (0:00:00.193) 0:10:56.717 ********** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 05:14:35 -0400 (0:00:00.237) 0:10:56.955 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:14:35 -0400 (0:00:00.178) 0:10:57.133 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:14:36 -0400 (0:00:00.256) 0:10:57.390 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:14:36 -0400 (0:00:00.227) 0:10:57.617 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:14:40 -0400 (0:00:04.265) 0:11:01.883 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:14:40 -0400 (0:00:00.248) 0:11:02.131 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:14:41 -0400 (0:00:00.220) 0:11:02.351 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:14:41 -0400 (0:00:00.246) 0:11:02.598 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 
2026 05:14:41 -0400 (0:00:00.221) 0:11:02.819 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:14:41 -0400 (0:00:00.266) 0:11:03.086 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:14:42 -0400 (0:00:00.223) 0:11:03.309 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:14:42 -0400 (0:00:00.196) 0:11:03.506 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:14:42 -0400 (0:00:00.232) 0:11:03.738 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 
April 2026 05:14:42 -0400 (0:00:00.276) 0:11:04.014 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 05:14:43 -0400 (0:00:00.249) 0:11:04.263 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 05:14:43 -0400 (0:00:00.219) 0:11:04.482 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 05:14:43 -0400 (0:00:00.248) 0:11:04.731 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 05:14:43 -0400 (0:00:00.214) 0:11:04.946 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 05:14:43 
-0400 (0:00:00.147) 0:11:05.094 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 05:14:44 -0400 (0:00:00.199) 0:11:05.293 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 05:14:44 -0400 (0:00:00.259) 0:11:05.553 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 05:14:44 -0400 (0:00:00.196) 0:11:05.749 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:14:44 -0400 (0:00:00.379) 0:11:06.128 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:14:45 -0400 (0:00:00.336) 0:11:06.465 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } 
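The RAID-related tasks above ("Set active devices regex", "Set spare devices regex", "Set md version regex", "Set chunk size regex", "Parse the chunk size") are all skipped here because the volume under test is a plain partition, not an MD device. For readers following along, the kind of check these tasks would perform is regex extraction against `mdadm --detail` output. The sketch below is illustrative only: the sample output and the `parse_md_detail` helper are invented for this note, not part of the test suite.

```python
import re

# Invented sample of `mdadm --detail /dev/md0` output, for illustration only.
# (In this test run the tasks were skipped, so no real mdadm output exists.)
MDADM_DETAIL = """\
/dev/md0:
        Version : 1.2
     Raid Level : raid1
  Active Devices : 2
   Spare Devices : 1
     Chunk Size : 512K
"""

def parse_md_detail(text):
    """Extract the fields the skipped test tasks build regexes for."""
    patterns = {
        "active_devices": r"Active Devices\s*:\s*(\d+)",
        "spare_devices": r"Spare Devices\s*:\s*(\d+)",
        "metadata_version": r"Version\s*:\s*(\S+)",
        "chunk_size": r"Chunk Size\s*:\s*(\S+)",
    }
    # For each field, keep the first capture group or None if absent.
    return {key: (m.group(1) if (m := re.search(pat, text)) else None)
            for key, pat in patterns.items()}

info = parse_md_detail(MDADM_DETAIL)
print(info)
# → {'active_devices': '2', 'spare_devices': '1', 'metadata_version': '1.2', 'chunk_size': '512K'}
```

In the real role these comparisons are expressed as Jinja2 regex filters inside `assert` tasks rather than Python code; the logic is the same pattern-per-field matching shown here.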
TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:14:45 -0400 (0:00:00.245) 0:11:06.711 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:14:45 -0400 (0:00:00.185) 0:11:06.896 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:14:46 -0400 (0:00:00.274) 0:11:07.171 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:14:46 -0400 (0:00:00.345) 0:11:07.516 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:14:46 -0400 (0:00:00.210) 0:11:07.727 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 05:14:46 -0400 (0:00:00.242) 0:11:07.970 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:14:47 -0400 (0:00:00.323) 0:11:08.293 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:14:47 -0400 (0:00:00.233) 0:11:08.526 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:14:47 -0400 (0:00:00.184) 0:11:08.711 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 05:14:47 -0400 (0:00:00.271) 0:11:08.982 ********** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 05:14:48 -0400 (0:00:00.372) 
0:11:09.354 ********** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 05:14:48 -0400 (0:00:00.415) 0:11:09.770 ********** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 05:14:48 -0400 (0:00:00.253) 0:11:10.023 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 05:14:49 -0400 (0:00:00.279) 0:11:10.302 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 05:14:49 -0400 (0:00:00.245) 0:11:10.548 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 05:14:49 -0400 (0:00:00.352) 0:11:10.901 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 05:14:50 -0400 (0:00:00.308) 0:11:11.210 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 05:14:50 -0400 (0:00:00.365) 0:11:11.575 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 05:14:50 -0400 (0:00:00.172) 0:11:11.748 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 05:14:50 -0400 (0:00:00.208) 0:11:11.956 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 05:14:51 -0400 (0:00:00.230) 0:11:12.187 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:14:51 -0400 (0:00:00.290) 0:11:12.478 ********** 
skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 05:14:51 -0400 (0:00:00.310) 0:11:12.789 ********** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 05:14:51 -0400 (0:00:00.351) 0:11:13.140 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 05:14:52 -0400 (0:00:00.272) 0:11:13.412 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 05:14:52 -0400 (0:00:00.249) 0:11:13.662 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 05:14:52 -0400 (0:00:00.209) 0:11:13.872 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 05:14:53 -0400 (0:00:00.296) 0:11:14.168 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 05:14:53 -0400 (0:00:00.217) 0:11:14.385 ********** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 05:14:53 -0400 (0:00:00.216) 0:11:14.602 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 05:14:53 -0400 (0:00:00.301) 0:11:14.903 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 05:14:54 -0400 (0:00:00.296) 0:11:15.200 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 05:14:54 -0400 (0:00:00.276) 0:11:15.477 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 05:14:54 -0400 (0:00:00.237) 0:11:15.714 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 05:14:54 -0400 (0:00:00.177) 0:11:15.891 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 05:14:55 -0400 (0:00:00.287) 0:11:16.179 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 05:14:55 -0400 (0:00:00.191) 0:11:16.371 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 
April 2026 05:14:55 -0400 (0:00:00.403) 0:11:16.774 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 05:14:55 -0400 (0:00:00.157) 0:11:16.931 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 05:14:55 -0400 (0:00:00.124) 0:11:17.056 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 05:14:56 -0400 (0:00:00.155) 0:11:17.211 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 05:14:56 -0400 (0:00:00.251) 0:11:17.463 ********** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Monday 20 April 2026 05:14:58 -0400 (0:00:01.705) 0:11:19.169 ********** 
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 05:14:58 -0400 (0:00:00.607) 0:11:19.776 ********** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 05:14:58 -0400 (0:00:00.222) 0:11:19.998 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:14:59 -0400 (0:00:00.330) 0:11:20.329 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:14:59 -0400 (0:00:00.280) 0:11:20.609 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:14:59 -0400 (0:00:00.403) 0:11:21.012 ********** ok: [managed-node2] => { "ansible_facts": { 
"discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:15:01 -0400 (0:00:01.803) 0:11:22.816 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:15:01 -0400 (0:00:00.275) 0:11:23.091 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:15:04 -0400 (0:00:02.199) 0:11:25.290 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { 
"ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:15:04 -0400 (0:00:00.419) 0:11:25.710 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:15:05 -0400 (0:00:00.506) 0:11:26.217 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:15:05 -0400 (0:00:00.267) 0:11:26.484 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:15:05 -0400 (0:00:00.162) 0:11:26.646 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:15:05 -0400 (0:00:00.308) 0:11:26.955 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:15:06 -0400 (0:00:00.594) 0:11:27.549 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:15:06 -0400 (0:00:00.286) 0:11:27.835 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:15:07 -0400 (0:00:00.326) 0:11:28.162 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:15:11 -0400 (0:00:04.581) 0:11:32.744 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": 
"yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:15:11 -0400 (0:00:00.348) 0:11:33.092 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:15:12 -0400 (0:00:00.212) 0:11:33.305 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:15:17 -0400 (0:00:05.228) 0:11:38.533 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:15:17 -0400 (0:00:00.214) 0:11:38.748 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:15:17 -0400 (0:00:00.156) 0:11:38.904 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:15:17 -0400 (0:00:00.181) 0:11:39.086 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:15:18 -0400 (0:00:00.124) 0:11:39.211 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:15:21 -0400 (0:00:03.799) 0:11:43.010 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": 
"inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": 
"quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": 
"systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service": { "name": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service": { "name": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": 
{ "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:15:24 -0400 (0:00:02.375) 0:11:45.385 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d082dbd3e\x2d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": 
"no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-082dbd3e-5ddd-4519-a194-5805ce58d144", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-082dbd3e-5ddd-4519-a194-5805ce58d144 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; 
code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-082dbd3e-5ddd-4519-a194-5805ce58d144 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": 
"14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not 
set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:13:08 EDT", "StateChangeTimestampMonotonic": "2326489448", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", 
"CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": 
"infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", 
"ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:15:28 -0400 (0:00:04.197) 0:11:49.583 ********** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:15:34 -0400 (0:00:05.642) 0:11:55.225 ********** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 
'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:15:34 -0400 (0:00:00.273) 
0:11:55.499 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d082dbd3e\x2d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d082dbd3e\\x2d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", 
"StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d5ddd\x2d4519\x2da194\x2d5805ce58d144.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "name": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable 
cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", 
"LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d5ddd\\x2d4519\\x2da194\\x2d5805ce58d144.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", 
"RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:15:38 -0400 (0:00:03.689) 0:11:59.189 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:15:38 -0400 (0:00:00.351) 0:11:59.540 ********** ok: 
[managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:15:38 -0400 (0:00:00.333) 0:11:59.873 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 05:15:39 -0400 (0:00:00.315) 0:12:00.188 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676497.7935002, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776676497.7935002, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776676497.7935002, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3978971280", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 05:15:40 -0400 (0:00:01.674) 0:12:01.863 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Create a key file] 
******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:317 Monday 20 April 2026 05:15:40 -0400 (0:00:00.219) 0:12:02.083 ********** ok: [managed-node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testtvjfugcjlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:324 Monday 20 April 2026 05:15:44 -0400 (0:00:03.165) 0:12:05.248 ********** ok: [managed-node2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testtvjfugcjlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1776676544.3919652-202268-99865851847052/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:331 Monday 20 April 2026 05:15:47 -0400 (0:00:03.545) 0:12:08.794 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:15:47 -0400 (0:00:00.241) 0:12:09.036 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run 
the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:15:48 -0400 (0:00:00.114) 0:12:09.150 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:15:48 -0400 (0:00:00.143) 0:12:09.294 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:15:49 -0400 (0:00:01.577) 0:12:10.871 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:15:49 -0400 (0:00:00.155) 0:12:11.026 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:15:51 -0400 (0:00:01.675) 0:12:12.702 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => 
(item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:15:51 -0400 (0:00:00.321) 0:12:13.023 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:15:52 -0400 (0:00:00.174) 0:12:13.198 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:15:52 -0400 (0:00:00.140) 0:12:13.338 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:15:52 -0400 (0:00:00.154) 0:12:13.493 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:15:52 -0400 (0:00:00.172) 0:12:13.665 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:15:52 -0400 (0:00:00.384) 0:12:14.050 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:15:53 -0400 (0:00:00.329) 0:12:14.380 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 
05:15:53 -0400 (0:00:00.196) 0:12:14.576 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:15:57 -0400 (0:00:04.296) 0:12:18.872 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:15:58 -0400 (0:00:00.319) 0:12:19.192 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:15:58 -0400 (0:00:00.146) 0:12:19.339 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:16:03 -0400 (0:00:05.261) 0:12:24.600 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:16:03 -0400 (0:00:00.400) 0:12:25.001 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:16:03 -0400 (0:00:00.041) 0:12:25.042 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:16:04 -0400 (0:00:00.150) 0:12:25.193 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:16:04 -0400 (0:00:00.130) 0:12:25.324 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:16:07 -0400 (0:00:03.595) 0:12:28.919 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": 
"auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": 
"lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": 
"plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": 
"stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": 
"systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { 
"name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:16:10 -0400 (0:00:02.864) 0:12:31.784 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:16:10 -0400 (0:00:00.284) 0:12:32.069 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "password": "/tmp/storage_testtvjfugcjlukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": 
true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:16:26 -0400 (0:00:15.697) 0:12:47.766 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:16:26 -0400 (0:00:00.276) 0:12:48.043 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676441.4286265, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1c3f34e3d7d9368dcba1bbf2e2be90509ee80400", "ctime": 1776676441.4256265, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776676441.4256265, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2863117270", 
"wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:16:28 -0400 (0:00:01.572) 0:12:49.616 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:16:30 -0400 (0:00:01.590) 0:12:51.206 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:16:30 -0400 (0:00:00.351) 0:12:51.557 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "password": "/tmp/storage_testtvjfugcjlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": 
null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK 
[fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:16:30 -0400 (0:00:00.362) 0:12:51.920 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, 
"thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:16:30 -0400 (0:00:00.212) 0:12:52.133 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:16:31 -0400 (0:00:00.202) 0:12:52.335 ********** changed: [managed-node2] => (item={'src': 'UUID=236651e8-816f-4637-b539-a3edb65ca73c', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=236651e8-816f-4637-b539-a3edb65ca73c" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:16:32 -0400 (0:00:01.659) 0:12:53.994 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:16:34 -0400 (0:00:02.017) 0:12:56.012 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', 'path': '/opt/test1', 'fstype': 
'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:16:36 -0400 (0:00:01.608) 0:12:57.620 ********** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:16:36 -0400 (0:00:00.166) 0:12:57.786 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 
Monday 20 April 2026 05:16:38 -0400 (0:00:01.663) 0:12:59.450 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676452.7486012, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776676446.062616, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 440402053, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776676446.060616, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2566953472", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:16:39 -0400 (0:00:01.484) 0:13:00.934 ********** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-1c22551b-e2b2-491a-a335-062f6b859ca3', 'password': '/tmp/storage_testtvjfugcjlukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "password": "/tmp/storage_testtvjfugcjlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:16:41 -0400 (0:00:01.579) 0:13:02.514 ********** ok: [managed-node2] TASK [Verify role results - 6] 
************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:348 Monday 20 April 2026 05:16:42 -0400 (0:00:01.491) 0:13:04.005 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:16:43 -0400 (0:00:00.227) 0:13:04.233 ********** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:16:43 -0400 (0:00:00.083) 0:13:04.316 ********** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:16:43 -0400 (0:00:00.071) 0:13:04.388 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "size": "4G", "type": "crypt", "uuid": "143f207f-663c-476c-83c9-4d3f8de6df13" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "1c22551b-e2b2-491a-a335-062f6b859ca3" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, 
"/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:16:44 -0400 (0:00:01.217) 0:13:05.606 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002610", "end": "2026-04-20 05:16:45.457798", "rc": 0, "start": "2026-04-20 05:16:45.455188" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:16:45 -0400 (0:00:01.343) 0:13:06.949 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002188", "end": "2026-04-20 05:16:47.033077", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:16:47.030889" } STDOUT: luks-1c22551b-e2b2-491a-a335-062f6b859ca3 /dev/sda1 /tmp/storage_testtvjfugcjlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:16:47 -0400 (0:00:01.387) 0:13:08.336 ********** included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 05:16:47 -0400 (0:00:00.322) 0:13:08.659 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 05:16:47 -0400 (0:00:00.186) 0:13:08.845 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 05:16:47 -0400 (0:00:00.214) 0:13:09.060 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 05:16:48 -0400 (0:00:00.200) 0:13:09.260 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 
April 2026 05:16:48 -0400 (0:00:00.389) 0:13:09.649 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 05:16:48 -0400 (0:00:00.135) 0:13:09.785 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 05:16:48 -0400 (0:00:00.163) 0:13:09.948 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 05:16:48 -0400 (0:00:00.176) 0:13:10.125 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 05:16:49 -0400 (0:00:00.115) 0:13:10.240 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 05:16:49 -0400 (0:00:00.241) 0:13:10.481 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] 
************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 05:16:49 -0400 (0:00:00.245) 0:13:10.727 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 05:16:49 -0400 (0:00:00.295) 0:13:11.022 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 05:16:50 -0400 (0:00:00.237) 0:13:11.260 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 05:16:50 -0400 (0:00:00.230) 0:13:11.490 ********** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.110 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 05:16:51 -0400 (0:00:01.457) 0:13:12.948 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 05:16:51 -0400 (0:00:00.132) 0:13:13.081 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 05:16:52 -0400 (0:00:00.173) 0:13:13.254 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 05:16:52 -0400 (0:00:00.141) 0:13:13.396 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 05:16:52 -0400 (0:00:00.155) 0:13:13.551 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 
Monday 20 April 2026 05:16:52 -0400 (0:00:00.167) 0:13:13.719 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 05:16:52 -0400 (0:00:00.130) 0:13:13.850 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 05:16:52 -0400 (0:00:00.197) 0:13:14.048 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 05:16:53 -0400 (0:00:00.184) 0:13:14.232 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 05:16:53 -0400 (0:00:00.127) 0:13:14.360 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 05:16:53 -0400 (0:00:00.248) 0:13:14.608 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 05:16:53 -0400 (0:00:00.263) 0:13:14.872 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 05:16:53 -0400 (0:00:00.250) 0:13:15.122 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 05:16:54 -0400 (0:00:00.188) 0:13:15.311 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 05:16:54 -0400 (0:00:00.391) 0:13:15.702 ********** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testtvjfugcjlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 
'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 05:16:54 -0400 (0:00:00.310) 0:13:16.013 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 05:16:55 -0400 (0:00:00.395) 0:13:16.408 ********** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testtvjfugcjlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', '_raw_device': '/dev/sda1', '_mount_id': 
'/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 05:16:55 -0400 (0:00:00.251) 0:13:16.660 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 05:16:55 -0400 (0:00:00.309) 0:13:16.969 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 05:16:56 -0400 (0:00:00.233) 0:13:17.202 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 05:16:56 -0400 (0:00:00.108) 0:13:17.310 ********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 05:16:56 -0400 (0:00:00.152) 0:13:17.463 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 05:16:56 -0400 (0:00:00.224) 0:13:17.687 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 05:16:56 
-0400 (0:00:00.357) 0:13:18.045 ********** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testtvjfugcjlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testtvjfugcjlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 05:16:57 -0400 (0:00:00.283) 0:13:18.329 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 05:16:57 -0400 (0:00:00.413) 0:13:18.742 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 05:16:57 -0400 (0:00:00.114) 0:13:18.857 ********** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 05:16:57 -0400 (0:00:00.277) 0:13:19.134 ********** 
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 05:16:58 -0400 (0:00:00.252) 0:13:19.386 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 05:16:58 -0400 (0:00:00.247) 0:13:19.633 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 05:16:58 -0400 (0:00:00.216) 0:13:19.850 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 05:16:58 -0400 (0:00:00.153) 0:13:20.004 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 05:16:58 -0400 (0:00:00.134) 0:13:20.138 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, 
"_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 05:16:59 -0400 (0:00:00.036) 0:13:20.175 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:16:59 -0400 (0:00:00.288) 0:13:20.464 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:16:59 -0400 (0:00:00.196) 0:13:20.660 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:17:00 -0400 (0:00:01.070) 0:13:21.731 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:17:00 -0400 (0:00:00.180) 0:13:21.911 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:17:01 -0400 (0:00:00.259) 0:13:22.170 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:17:01 -0400 (0:00:00.311) 0:13:22.482 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK 
[Verify mount directory user] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:17:01 -0400 (0:00:00.281) 0:13:22.764 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:17:01 -0400 (0:00:00.220) 0:13:22.985 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:17:02 -0400 (0:00:00.301) 0:13:23.287 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 05:17:02 -0400 (0:00:00.192) 0:13:23.479 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:17:02 -0400 (0:00:00.181) 0:13:23.661 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 05:17:02 -0400 (0:00:00.187) 0:13:23.849 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:17:03 -0400 (0:00:00.345) 0:13:24.195 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:17:03 -0400 (0:00:00.137) 0:13:24.333 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 05:17:03 -0400 (0:00:00.518) 0:13:24.852 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 05:17:03 -0400 (0:00:00.234) 0:13:25.087 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 05:17:04 -0400 (0:00:00.168) 0:13:25.256 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 05:17:04 -0400 (0:00:00.163) 0:13:25.419 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 05:17:04 -0400 (0:00:00.224) 0:13:25.643 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 05:17:04 -0400 (0:00:00.211) 0:13:25.855 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] 
********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 05:17:04 -0400 (0:00:00.281) 0:13:26.136 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 05:17:05 -0400 (0:00:00.160) 0:13:26.297 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676586.135302, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676586.135302, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217220, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776676586.135302, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 05:17:06 -0400 (0:00:01.182) 0:13:27.479 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 05:17:06 -0400 
(0:00:00.184) 0:13:27.664 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 05:17:06 -0400 (0:00:00.248) 0:13:27.912 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 05:17:06 -0400 (0:00:00.211) 0:13:28.124 ********** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 05:17:07 -0400 (0:00:00.259) 0:13:28.383 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:17:07 -0400 (0:00:00.134) 0:13:28.517 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:17:07 -0400 (0:00:00.187) 0:13:28.705 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676586.2893016, "attr_flags": "", "attributes": [], "block_size": 
4096, "blocks": 0, "charset": "binary", "ctime": 1776676586.2893016, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 252193, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776676586.2893016, "nlink": 1, "path": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:17:09 -0400 (0:00:01.658) 0:13:30.364 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:17:13 -0400 (0:00:03.865) 0:13:34.229 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010703", "end": "2026-04-20 05:17:14.053564", "rc": 0, "start": "2026-04-20 05:17:14.042861" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 1c22551b-e2b2-491a-a335-062f6b859ca3 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits 
PBKDF: argon2i Time cost: 4 Memory: 918023 Threads: 2 Salt: b1 09 97 58 c2 b4 e7 e5 76 95 f3 4f 77 81 98 6a 5e c5 2a 53 77 62 cc c2 02 c1 ea a7 0e f0 d8 af AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 19 0e 5a 6c 78 e1 3e ff 06 85 fa 8d 22 d2 ef 53 82 69 dc 4f 09 ce 04 cd 26 96 37 c8 a6 96 c4 9d Digest: 15 ec 30 bc 46 f6 11 4b fe de ee e7 93 5d 92 95 d3 03 33 26 81 65 42 c3 6c b2 d6 6b 71 35 24 7f TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:17:14 -0400 (0:00:01.169) 0:13:35.399 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:17:14 -0400 (0:00:00.184) 0:13:35.583 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:17:14 -0400 (0:00:00.230) 0:13:35.814 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 05:17:14 -0400 (0:00:00.134) 0:13:35.948 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:17:15 -0400 (0:00:00.267) 0:13:36.216 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:17:15 -0400 (0:00:00.283) 0:13:36.499 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:17:15 -0400 (0:00:00.201) 0:13:36.701 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:17:15 -0400 (0:00:00.123) 0:13:36.825 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-1c22551b-e2b2-491a-a335-062f6b859ca3 /dev/sda1 /tmp/storage_testtvjfugcjlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testtvjfugcjlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 05:17:15 -0400 (0:00:00.245) 0:13:37.071 ********** ok: [managed-node2] => { 
"changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 05:17:16 -0400 (0:00:00.171) 0:13:37.242 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 05:17:16 -0400 (0:00:00.153) 0:13:37.396 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 05:17:16 -0400 (0:00:00.221) 0:13:37.618 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 05:17:16 -0400 (0:00:00.175) 0:13:37.793 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 05:17:16 -0400 (0:00:00.169) 0:13:37.963 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices 
regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 05:17:17 -0400 (0:00:00.234) 0:13:38.198 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 05:17:17 -0400 (0:00:00.221) 0:13:38.420 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 05:17:17 -0400 (0:00:00.282) 0:13:38.702 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:17:17 -0400 (0:00:00.153) 0:13:38.855 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:17:17 -0400 (0:00:00.102) 0:13:38.958 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:17:17 -0400 (0:00:00.129) 0:13:39.088 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:17:17 -0400 (0:00:00.033) 0:13:39.121 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:17:18 -0400 (0:00:00.059) 0:13:39.180 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:17:18 -0400 (0:00:00.265) 0:13:39.446 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:17:18 -0400 (0:00:00.207) 0:13:39.654 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 
05:17:18 -0400 (0:00:00.246) 0:13:39.900 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:17:18 -0400 (0:00:00.181) 0:13:40.081 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:17:19 -0400 (0:00:00.104) 0:13:40.186 ********** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:17:19 -0400 (0:00:00.250) 0:13:40.436 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 05:17:19 -0400 (0:00:00.211) 0:13:40.648 ********** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 05:17:19 -0400 (0:00:00.234) 0:13:40.882 ********** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 05:17:19 -0400 (0:00:00.205) 0:13:41.088 ********** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 05:17:20 -0400 (0:00:00.256) 0:13:41.344 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 05:17:20 -0400 (0:00:00.221) 0:13:41.565 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 05:17:20 -0400 (0:00:00.257) 0:13:41.822 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 05:17:20 -0400 (0:00:00.218) 0:13:42.041 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 05:17:21 -0400 (0:00:00.280) 0:13:42.321 
********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 05:17:21 -0400 (0:00:00.247) 0:13:42.569 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 05:17:21 -0400 (0:00:00.186) 0:13:42.756 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 05:17:21 -0400 (0:00:00.254) 0:13:43.010 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 05:17:22 -0400 (0:00:00.329) 0:13:43.339 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:17:22 -0400 (0:00:00.321) 0:13:43.660 ********** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 05:17:22 -0400 (0:00:00.323) 0:13:43.984 **********
skipping: [managed-node2] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 05:17:23 -0400 (0:00:00.308) 0:13:44.293 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 05:17:23 -0400 (0:00:00.319) 0:13:44.613 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 05:17:23 -0400 (0:00:00.242) 0:13:44.855 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 20 April 2026 05:17:23 -0400 (0:00:00.221) 0:13:45.077 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 20 April 2026 05:17:24 -0400 (0:00:00.265) 0:13:45.342 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 20 April 2026 05:17:24 -0400 (0:00:00.226) 0:13:45.569 **********
ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 20 April 2026 05:17:25 -0400 (0:00:00.604) 0:13:46.174 **********
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 20 April 2026 05:17:25 -0400 (0:00:00.187) 0:13:46.361 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 20 April 2026 05:17:25 -0400 (0:00:00.231) 0:13:46.593 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 05:17:25 -0400 (0:00:00.135) 0:13:46.728 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 05:17:25 -0400 (0:00:00.236) 0:13:46.964 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 05:17:25 -0400 (0:00:00.160) 0:13:47.125 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 05:17:26 -0400 (0:00:00.275) 0:13:47.401 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 05:17:26 -0400 (0:00:00.269) 0:13:47.670 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 05:17:26 -0400 (0:00:00.233) 0:13:47.903 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 05:17:26 -0400 (0:00:00.139) 0:13:48.043 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026 05:17:27 -0400 (0:00:00.259) 0:13:48.303 **********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 05:17:27 -0400 (0:00:00.196) 0:13:48.499 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Remove the key file] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:351
Monday 20 April 2026 05:17:27 -0400 (0:00:00.242) 0:13:48.742 **********
ok: [managed-node2] => { "changed": false, "path": "/tmp/storage_testtvjfugcjlukskey", "state": "absent" }

TASK [Test for correct handling of new encrypted volume w/ no key - 3] *********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:361
Monday 20 April 2026 05:17:28 -0400 (0:00:01.320) 0:13:50.062 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Monday 20 April 2026 05:17:29 -0400 (0:00:00.236) 0:13:50.299 **********
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Monday 20 April 2026 05:17:29 -0400 (0:00:00.183) 0:13:50.483 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 05:17:29 -0400 (0:00:00.224) 0:13:50.707 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 05:17:29 -0400 (0:00:00.120) 0:13:50.828 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 05:17:29 -0400 (0:00:00.203) 0:13:51.031 **********
ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Monday 20 April 2026 05:17:31 -0400 (0:00:01.596) 0:13:52.628 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 05:17:31 -0400 (0:00:00.169) 0:13:52.797 **********
ok: [managed-node2]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 05:17:33 -0400 (0:00:01.850) 0:13:54.647 **********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 20 April 2026 05:17:34 -0400 (0:00:00.589) 0:13:55.237 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 20 April 2026 05:17:34 -0400 (0:00:00.297) 0:13:55.535 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:17:34 -0400 (0:00:00.163) 0:13:55.698 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:17:34 -0400 (0:00:00.102) 0:13:55.801 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:17:34 -0400 (0:00:00.173) 0:13:55.975 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:17:35 -0400 (0:00:00.425) 0:13:56.400 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:17:35 -0400 (0:00:00.173) 0:13:56.574 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 05:17:35 -0400 (0:00:00.197) 0:13:56.771 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 05:17:39 -0400 (0:00:04.082) 0:14:00.854 **********
ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 05:17:39 -0400 (0:00:00.254) 0:14:01.109 **********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 05:17:40 -0400 (0:00:00.276) 0:14:01.385 **********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 05:17:45 -0400 (0:00:05.574) 0:14:06.960 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 05:17:46 -0400 (0:00:00.445) 0:14:07.406 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 05:17:46 -0400 (0:00:00.163) 0:14:07.569 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 05:17:46 -0400
(0:00:00.241) 0:14:07.810 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 05:17:46 -0400 (0:00:00.158) 0:14:07.969 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 05:17:51 -0400 (0:00:04.309) 0:14:12.278 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
            "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
            "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" },
            "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
            "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
            "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
            "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
            "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
            "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" },
            "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
            "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" },
            "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" },
            "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
            "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" },
            "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" },
            "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" },
            "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" },
            "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" },
            "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" },
            "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" },
            "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" },
            "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" },
            "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" },
            "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" },
            "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" },
            "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
            "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd",
"state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": 
"vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:17:54 -0400 (0:00:02.942) 0:14:15.220 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:17:54 -0400 (0:00:00.356) 0:14:15.577 ********** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:17:59 -0400 (0:00:05.475) 0:14:21.052 ********** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 
'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:18:00 -0400 (0:00:00.274) 0:14:21.327 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:18:00 -0400 (0:00:00.288) 0:14:21.616 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:18:00 -0400 (0:00:00.117) 
0:14:21.733 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:18:00 -0400 (0:00:00.231) 0:14:21.965 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:380 Monday 20 April 2026 05:18:00 -0400 (0:00:00.142) 0:14:22.107 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:18:01 -0400 (0:00:00.173) 0:14:22.281 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:18:01 -0400 (0:00:00.258) 0:14:22.539 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:18:01 -0400 (0:00:00.298) 0:14:22.838 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:18:03 -0400 (0:00:01.643) 0:14:24.481 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:18:03 -0400 (0:00:00.303) 0:14:24.784 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:18:05 -0400 (0:00:01.912) 0:14:26.697 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", 
"libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:18:05 -0400 (0:00:00.439) 0:14:27.137 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:18:06 -0400 (0:00:00.250) 0:14:27.388 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:18:06 -0400 (0:00:00.112) 0:14:27.501 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:18:06 -0400 (0:00:00.115) 0:14:27.616 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider 
tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:18:06 -0400 (0:00:00.121) 0:14:27.738 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:18:06 -0400 (0:00:00.264) 0:14:28.003 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:18:07 -0400 (0:00:00.196) 0:14:28.199 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:18:07 -0400 (0:00:00.184) 0:14:28.384 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:18:11 -0400 (0:00:04.264) 0:14:32.648 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": 
"/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:18:11 -0400 (0:00:00.271) 0:14:32.920 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:18:11 -0400 (0:00:00.184) 0:14:33.104 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:18:16 -0400 (0:00:04.885) 0:14:37.990 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:18:17 -0400 (0:00:00.190) 0:14:38.180 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:18:17 -0400 (0:00:00.195) 0:14:38.376 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 05:18:17 -0400 (0:00:00.204) 0:14:38.580 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 05:18:17 -0400 (0:00:00.196) 0:14:38.777 **********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 05:18:21 -0400 (0:00:03.747) 0:14:42.524 **********
ok: [managed-node2] => { "ansible_facts": { "services": {
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" },
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
"cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" },
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
"dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
"dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
"ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
"firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
"halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" },
"import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
"iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
"kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
"kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
"loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" },
"lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" },
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" },
"nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" },
"nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" },
"plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" },
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" },
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"quotaon.service": { "name": "quotaon.service", "source": "systemd", "state":
"inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { 
"name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { 
"name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:18:23 -0400 (0:00:02.362) 0:14:44.887 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:18:24 -0400 (0:00:00.293) 0:14:45.180 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": "xfs" } ], "changed": true, "crypts": [ { 
"backing_device": "/dev/sda1", "name": "luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:18:39 -0400 (0:00:15.555) 0:15:00.736 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:18:39 -0400 (0:00:00.136) 0:15:00.872 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676596.1692796, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "253e36ee1083cf7561a0df0e233823374a605c39", "ctime": 1776676596.1662796, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776676596.1662796, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": 
"2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:18:40 -0400 (0:00:01.039) 0:15:01.912 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:18:42 -0400 (0:00:01.387) 0:15:03.299 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:18:42 -0400 (0:00:00.266) 0:15:03.565 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:18:42 -0400 (0:00:00.302) 0:15:03.868 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], 
"encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:18:42 -0400 (0:00:00.228) 0:15:04.096 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:18:43 -0400 (0:00:00.198) 0:15:04.295 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"/dev/mapper/luks-1c22551b-e2b2-491a-a335-062f6b859ca3" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:18:44 -0400 (0:00:01.623) 0:15:05.918 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:18:46 -0400 (0:00:01.894) 0:15:07.813 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:18:48 -0400 (0:00:01.478) 0:15:09.292 ********** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, 
"fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:18:48 -0400 (0:00:00.310) 0:15:09.602 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:18:50 -0400 (0:00:01.889) 0:15:11.492 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676607.0322552, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "224ef526b8036a7201eadf709fb2b428d5273ec0", "ctime": 1776676601.0032687, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 71303369, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776676601.0022686, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "1100132425", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 
05:18:51 -0400 (0:00:01.478) 0:15:12.970 ********** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-1c22551b-e2b2-491a-a335-062f6b859ca3', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-8fc43477-33fc-4cea-9240-68fd98da42e7', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:18:54 -0400 (0:00:02.428) 0:15:15.399 ********** ok: [managed-node2] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:398 Monday 20 April 2026 05:18:55 -0400 (0:00:01.639) 0:15:17.038 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:18:56 -0400 (0:00:00.329) 0:15:17.368 ********** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:18:56 -0400 (0:00:00.246) 0:15:17.615 ********** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:18:56 -0400 (0:00:00.174) 0:15:17.789 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "8fc43477-33fc-4cea-9240-68fd98da42e7" }, "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "size": "4G", "type": "crypt", "uuid": "44b886f9-1544-407f-aa7d-bb50dcfc9446" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": 
"250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:18:58 -0400 (0:00:01.674) 0:15:19.463 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002243", "end": "2026-04-20 05:18:59.542227", "rc": 0, "start": "2026-04-20 05:18:59.539984" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:18:59 -0400 (0:00:01.461) 0:15:20.925 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002534", "end": "2026-04-20 05:19:01.028520", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:19:01.025986" } STDOUT:
luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /dev/mapper/foo-test1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:19:01 -0400 (0:00:01.523) 0:15:22.448 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for
managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 05:19:01 -0400 (0:00:00.453) 0:15:22.902 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 05:19:01 -0400 (0:00:00.180) 0:15:23.082 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023647", "end": "2026-04-20 05:19:03.442587", "rc": 0, "start": "2026-04-20 05:19:03.418940" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 05:19:03 -0400 (0:00:01.803) 0:15:24.886 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 05:19:04 -0400 (0:00:00.258) 0:15:25.144 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 
20 April 2026 05:19:04 -0400 (0:00:00.438) 0:15:25.583 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 05:19:04 -0400 (0:00:00.324) 0:15:25.908 ********** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 05:19:08 -0400 (0:00:03.432) 0:15:29.340 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 05:19:08 -0400 (0:00:00.211) 0:15:29.552 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 05:19:08 -0400 (0:00:00.232) 0:15:29.784 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 05:19:08 -0400 (0:00:00.228) 0:15:30.013 ********** 
ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 05:19:09 -0400 (0:00:00.234) 0:15:30.248 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 05:19:09 -0400 (0:00:00.182) 0:15:30.430 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 05:19:09 -0400 (0:00:00.213) 0:15:30.644 ********** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 05:19:09 -0400 (0:00:00.346) 0:15:30.991 ********** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.110 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 05:19:11 -0400 (0:00:01.699) 0:15:32.690 ********** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 05:19:11 -0400 (0:00:00.243) 0:15:32.934 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 05:19:12 -0400 (0:00:00.563) 0:15:33.497 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 05:19:12 -0400 (0:00:00.281) 0:15:33.778 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 05:19:12 -0400 (0:00:00.244) 0:15:34.023 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md 
version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 05:19:13 -0400 (0:00:00.147) 0:15:34.171 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 05:19:13 -0400 (0:00:00.166) 0:15:34.337 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 05:19:13 -0400 (0:00:00.140) 0:15:34.478 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 05:19:13 -0400 (0:00:00.169) 0:15:34.647 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 05:19:13 -0400 (0:00:00.183) 0:15:34.830 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 05:19:13 -0400 (0:00:00.189) 0:15:35.020 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 05:19:14 -0400 (0:00:00.145) 0:15:35.166 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 05:19:14 -0400 (0:00:00.141) 0:15:35.307 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 05:19:14 -0400 (0:00:00.122) 0:15:35.430 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 05:19:14 -0400 (0:00:00.337) 0:15:35.767 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get 
information about the LV] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 05:19:14 -0400 (0:00:00.284) 0:15:36.052 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 05:19:15 -0400 (0:00:00.144) 0:15:36.196 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 05:19:15 -0400 (0:00:00.153) 0:15:36.349 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 05:19:15 -0400 (0:00:00.191) 0:15:36.541 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 05:19:15 -0400 (0:00:00.179) 0:15:36.720 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 05:19:15 -0400 (0:00:00.194) 0:15:36.915 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 05:19:16 -0400 (0:00:00.231) 0:15:37.147 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 05:19:16 -0400 (0:00:00.333) 0:15:37.480 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 05:19:16 -0400 (0:00:00.442) 0:15:37.923 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 05:19:17 -0400 (0:00:00.454) 0:15:38.377 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 05:19:17 -0400 (0:00:00.256) 0:15:38.634 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 05:19:17 -0400 (0:00:00.340) 0:15:38.974 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 05:19:18 -0400 (0:00:00.291) 0:15:39.265 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 05:19:18 -0400 (0:00:00.153) 0:15:39.419 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 05:19:18 -0400 (0:00:00.538) 0:15:39.957 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 05:19:19 -0400 (0:00:00.207) 0:15:40.164 ********** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 05:19:19 -0400 (0:00:00.313) 0:15:40.478 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 05:19:19 -0400 (0:00:00.413) 0:15:40.892 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 05:19:20 -0400 (0:00:00.263) 0:15:41.156 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 05:19:20 -0400 (0:00:00.323) 0:15:41.479 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] 
********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 05:19:20 -0400 (0:00:00.205) 0:15:41.684 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 05:19:20 -0400 (0:00:00.278) 0:15:41.963 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 05:19:21 -0400 (0:00:00.264) 0:15:42.227 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 05:19:21 -0400 (0:00:00.167) 0:15:42.394 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 05:19:21 -0400 (0:00:00.250) 0:15:42.645 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] 
*************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 05:19:22 -0400 (0:00:00.536) 0:15:43.181 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 05:19:22 -0400 (0:00:00.575) 0:15:43.757 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 05:19:22 -0400 (0:00:00.232) 0:15:43.989 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 05:19:23 -0400 (0:00:00.334) 0:15:44.324 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 05:19:23 -0400 (0:00:00.326) 0:15:44.651 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 05:19:23 -0400 (0:00:00.337) 0:15:44.989 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 05:19:24 -0400 (0:00:00.223) 0:15:45.212 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 05:19:24 -0400 (0:00:00.225) 0:15:45.438 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 05:19:24 -0400 (0:00:00.225) 0:15:45.663 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 05:19:25 -0400 (0:00:00.561) 0:15:46.225 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 05:19:25 -0400 (0:00:00.203) 0:15:46.428 ********** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 05:19:25 -0400 (0:00:00.292) 0:15:46.721 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 05:19:25 -0400 (0:00:00.281) 0:15:47.003 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 05:19:26 -0400 (0:00:00.256) 0:15:47.259 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 05:19:26 -0400 (0:00:00.234) 0:15:47.493 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 05:19:26 -0400 (0:00:00.238) 0:15:47.732 ********** ok: [managed-node2] => { 
"ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 05:19:26 -0400 (0:00:00.152) 0:15:47.885 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 05:19:27 -0400 (0:00:00.267) 0:15:48.152 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:19:27 -0400 (0:00:00.334) 0:15:48.487 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:19:27 -0400 (0:00:00.342) 0:15:48.830 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:19:29 -0400 (0:00:02.221) 0:15:51.052 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:19:30 -0400 (0:00:00.308) 0:15:51.360 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:19:30 -0400 (0:00:00.328) 0:15:51.688 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:19:30 -0400 (0:00:00.319) 0:15:52.007 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:19:31 -0400 (0:00:00.207) 0:15:52.214 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:19:31 -0400 (0:00:00.190) 0:15:52.404 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:19:31 -0400 (0:00:00.207) 0:15:52.612 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 05:19:31 -0400 
(0:00:00.249) 0:15:52.861 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:19:31 -0400 (0:00:00.211) 0:15:53.073 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 05:19:32 -0400 (0:00:00.270) 0:15:53.344 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:19:32 -0400 (0:00:00.270) 0:15:53.614 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:19:32 -0400 (0:00:00.320) 0:15:53.935 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7 " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 05:19:33 -0400 (0:00:00.462) 0:15:54.398 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 05:19:33 -0400 (0:00:00.187) 0:15:54.585 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 05:19:33 -0400 (0:00:00.254) 0:15:54.839 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 05:19:33 -0400 (0:00:00.200) 0:15:55.040 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 05:19:34 -0400 (0:00:00.235) 0:15:55.276 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, 
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 05:19:34 -0400 (0:00:00.166) 0:15:55.443 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 05:19:34 -0400 (0:00:00.271) 0:15:55.715 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 05:19:34 -0400 (0:00:00.387) 0:15:56.102 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676719.2250035, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676719.2250035, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 267962, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776676719.2250035, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": 
true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 05:19:36 -0400 (0:00:01.665) 0:15:57.768 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 05:19:36 -0400 (0:00:00.229) 0:15:57.997 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 05:19:37 -0400 (0:00:00.317) 0:15:58.315 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 05:19:37 -0400 (0:00:00.308) 0:15:58.624 ********** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 05:19:37 -0400 (0:00:00.220) 0:15:58.845 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:19:38 -0400 (0:00:00.346) 0:15:59.191 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:19:38 -0400 (0:00:00.149) 0:15:59.341 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676719.3660033, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676719.3660033, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 268112, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776676719.3660033, "nlink": 1, "path": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:19:39 -0400 (0:00:01.523) 0:16:00.864 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:19:44 -0400 
(0:00:04.556) 0:16:05.420 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009758", "end": "2026-04-20 05:19:45.392533", "rc": 0, "start": "2026-04-20 05:19:45.382775" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 8fc43477-33fc-4cea-9240-68fd98da42e7 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 936262 Threads: 2 Salt: 91 a6 ed 84 f3 10 3d fc a6 bf 6b 6e 03 8f 80 a1 a1 4c 22 a5 64 c4 2e 36 49 e5 50 51 48 a8 3d 9e AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 8f 58 25 c2 e2 81 2e 63 fd bb 52 0e cc 82 33 a8 76 d4 0a 61 cf a1 71 81 20 71 d9 3f 2e 7d ea e8 Digest: 50 dd 78 e2 f3 01 5e 51 25 6e b6 3a 03 8f 32 7b 6f c4 38 d6 97 57 59 f2 15 46 0b db af 81 b6 6b TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:19:45 -0400 (0:00:01.411) 0:16:06.832 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:19:46 -0400 (0:00:00.356) 0:16:07.188 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:19:46 -0400 (0:00:00.266) 0:16:07.455 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 05:19:46 -0400 (0:00:00.243) 0:16:07.698 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:19:46 -0400 (0:00:00.210) 0:16:07.909 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:19:47 -0400 (0:00:00.412) 0:16:08.322 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:19:47 -0400 (0:00:00.368) 0:16:08.690 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:19:47 -0400 (0:00:00.388) 0:16:09.078 ********** ok: [managed-node2] => { 
"ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 05:19:48 -0400 (0:00:00.344) 0:16:09.422 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 05:19:48 -0400 (0:00:00.383) 0:16:09.806 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 05:19:48 -0400 (0:00:00.324) 0:16:10.130 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 05:19:49 -0400 (0:00:00.304) 0:16:10.435 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 05:19:49 -0400 (0:00:00.367) 0:16:10.802 ********** ok: [managed-node2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 05:19:49 -0400 (0:00:00.188) 0:16:10.991 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 05:19:50 -0400 (0:00:00.288) 0:16:11.279 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 05:19:50 -0400 (0:00:00.203) 0:16:11.483 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 05:19:50 -0400 (0:00:00.202) 0:16:11.685 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:19:50 -0400 (0:00:00.194) 0:16:11.880 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:19:50 -0400 (0:00:00.242) 0:16:12.122 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:19:51 -0400 (0:00:00.243) 0:16:12.365 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:19:51 -0400 (0:00:00.256) 0:16:12.622 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:19:51 -0400 (0:00:00.225) 0:16:12.848 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:19:51 -0400 (0:00:00.240) 0:16:13.088 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:19:52 -0400 (0:00:00.204) 0:16:13.292 ********** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 05:19:55 -0400 (0:00:03.261) 0:16:16.554 ********** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:19:56 -0400 (0:00:01.523) 0:16:18.078 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:19:57 -0400 (0:00:00.400) 0:16:18.478 ********** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:19:57 -0400 (0:00:00.318) 0:16:18.796 ********** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 05:19:59 -0400 (0:00:01.785) 0:16:20.582 **********
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 05:19:59 -0400 (0:00:00.283) 0:16:20.865 **********
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 05:19:59 -0400 (0:00:00.228) 0:16:21.094 **********
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 05:20:00 -0400 (0:00:00.191) 0:16:21.286 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 05:20:00 -0400 (0:00:00.183) 0:16:21.469 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 05:20:00 -0400 (0:00:00.255) 0:16:21.725 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 05:20:00 -0400 (0:00:00.217) 0:16:21.942 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 05:20:01 -0400 (0:00:00.261) 0:16:22.203 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 05:20:01 -0400 (0:00:00.272) 0:16:22.476 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 05:20:01 -0400 (0:00:00.149) 0:16:22.625 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 05:20:01 -0400 (0:00:00.155) 0:16:22.780 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April 2026 05:20:01 -0400 (0:00:00.131) 0:16:22.912 **********
skipping: [managed-node2] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 20 April 2026 05:20:02 -0400 (0:00:00.340) 0:16:23.252 **********
skipping: [managed-node2] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 05:20:02 -0400 (0:00:00.260) 0:16:23.512 **********
skipping: [managed-node2] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 05:20:02 -0400 (0:00:00.248) 0:16:23.761 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 05:20:02 -0400 (0:00:00.265) 0:16:24.026 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 05:20:03 -0400 (0:00:00.222) 0:16:24.249 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 20 April 2026 05:20:03 -0400 (0:00:00.307) 0:16:24.556 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 20 April 2026 05:20:03 -0400 (0:00:00.203) 0:16:24.759 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 20 April 2026 05:20:04 -0400 (0:00:00.412) 0:16:25.172 **********
ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 20 April 2026 05:20:04 -0400 (0:00:00.196) 0:16:25.368 **********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 20 April 2026 05:20:04 -0400 (0:00:00.212) 0:16:25.580 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 20 April 2026 05:20:04 -0400 (0:00:00.256) 0:16:25.837 **********
ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023928", "end": "2026-04-20 05:20:05.635503", "rc": 0, "start": "2026-04-20 05:20:05.611575" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 05:20:05 -0400 (0:00:01.079) 0:16:26.917 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 05:20:05 -0400 (0:00:00.164) 0:16:27.081 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 05:20:06 -0400 (0:00:00.123) 0:16:27.205 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 05:20:06 -0400 (0:00:00.097) 0:16:27.303 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 05:20:06 -0400 (0:00:00.125) 0:16:27.428 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 05:20:06 -0400 (0:00:00.104) 0:16:27.533 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 05:20:06 -0400 (0:00:00.077) 0:16:27.611 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026 05:20:06 -0400 (0:00:00.144) 0:16:27.755 **********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 05:20:06 -0400 (0:00:00.108) 0:16:27.864 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Verify preservation of encryption settings on existing LVM volume] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:401
Monday 20 April 2026 05:20:06 -0400 (0:00:00.187) 0:16:28.052 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 05:20:07 -0400 (0:00:00.181) 0:16:28.233 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 05:20:07 -0400 (0:00:00.146) 0:16:28.380 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 05:20:07 -0400 (0:00:00.128) 0:16:28.508 **********
ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Monday 20 April 2026 05:20:08 -0400 (0:00:01.455) 0:16:29.964 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 05:20:09 -0400 (0:00:00.213) 0:16:30.177 **********
ok: [managed-node2]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 05:20:10 -0400 (0:00:01.845) 0:16:32.022 **********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 20 April 2026 05:20:11 -0400 (0:00:00.469) 0:16:32.492 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 20 April 2026 05:20:11 -0400 (0:00:00.263) 0:16:32.755 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:20:11 -0400 (0:00:00.293) 0:16:33.048 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:20:12 -0400 (0:00:00.216) 0:16:33.265 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:20:12 -0400 (0:00:00.152) 0:16:33.417 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:20:12 -0400 (0:00:00.328) 0:16:33.746 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:20:12 -0400 (0:00:00.184) 0:16:33.930 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 05:20:12 -0400 (0:00:00.119) 0:16:34.050 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 05:20:17 -0400 (0:00:04.095) 0:16:38.145 **********
ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 05:20:17 -0400 (0:00:00.246) 0:16:38.392 **********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 05:20:17 -0400 (0:00:00.213) 0:16:38.605 **********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 05:20:22 -0400 (0:00:05.229) 0:16:43.834 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 05:20:22 -0400 (0:00:00.246) 0:16:44.082 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 05:20:23 -0400 (0:00:00.130) 0:16:44.212 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 05:20:23 -0400 (0:00:00.143) 0:16:44.355 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 05:20:23 -0400 (0:00:00.152) 0:16:44.508 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts]
******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:20:27 -0400 (0:00:03.796) 0:16:48.305 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { 
"name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": 
"mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": 
{ "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": 
"systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service": { "name": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service": { "name": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": 
"systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, 
"vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:20:30 -0400 (0:00:03.111) 0:16:51.416 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d1c22551b\x2de2b2\x2d491a\x2da335\x2d062f6b859ca3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "name": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket cryptsetup-pre.target tmp.mount system-systemd\\x2dcryptsetup.slice -.mount dev-sda1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-1c22551b-e2b2-491a-a335-062f6b859ca3", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-1c22551b-e2b2-491a-a335-062f6b859ca3 /dev/sda1 /tmp/storage_testtvjfugcjlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-1c22551b-e2b2-491a-a335-062f6b859ca3 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", 
"IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", 
"NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_testtvjfugcjlukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:18:49 EDT", "StateChangeTimestampMonotonic": "2668015537", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...de2b2\x2d491a\x2da335\x2d062f6b859ca3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "name": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap 
cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", 
"LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": 
"no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:20:33 -0400 (0:00:03.561) 0:16:54.978 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:20:39 -0400 (0:00:05.666) 0:17:00.644 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is 
present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:20:39 -0400 (0:00:00.265) 0:17:00.910 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676727.9599838, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "487c12cedec409e4b1c8a498737fe6376de786d0", "ctime": 1776676727.9569838, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776676727.9569838, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:20:41 -0400 (0:00:01.321) 0:17:02.232 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:20:41 -0400 (0:00:00.191) 0:17:02.423 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d1c22551b\x2de2b2\x2d491a\x2da335\x2d062f6b859ca3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", 
"name": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", 
"LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d1c22551b\\x2de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": 
"6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...de2b2\x2d491a\x2da335\x2d062f6b859ca3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "name": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease 
cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de2b2\\x2d491a\\x2da335\\x2d062f6b859ca3.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": 
"infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:20:45 -0400 (0:00:04.502) 0:17:06.925 ********** ok: [managed-node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:20:45 -0400 (0:00:00.206) 0:17:07.132 ********** ok: [managed-node2] => { "ansible_facts": { 
"_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:20:46 -0400 (0:00:00.226) 
0:17:07.358 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:20:46 -0400 (0:00:00.205) 0:17:07.564 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:20:46 -0400 (0:00:00.192) 0:17:07.756 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:20:48 -0400 (0:00:01.465) 0:17:09.222 ********** ok: [managed-node2] => (item={'src': '/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:20:48 -0400 (0:00:00.763) 0:17:09.985 ********** 
skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:20:48 -0400 (0:00:00.137) 0:17:10.123 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:20:50 -0400 (0:00:01.103) 0:17:11.227 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676741.0269547, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "942626ec1389b6a662a9a1ced8972a091329baff", "ctime": 1776676734.0559702, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 209715395, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776676734.0559702, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1064360692", "wgrp": false, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:20:51 -0400 (0:00:01.054) 0:17:12.281 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:20:51 -0400 (0:00:00.098) 0:17:12.379 ********** ok: [managed-node2] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:414 Monday 20 April 2026 05:20:52 -0400 (0:00:01.324) 0:17:13.704 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:421 Monday 20 April 2026 05:20:52 -0400 (0:00:00.147) 0:17:13.851 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:20:52 -0400 (0:00:00.149) 0:17:14.000 ********** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, 
"encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:20:52 -0400 (0:00:00.136) 0:17:14.137 ********** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:20:53 -0400 (0:00:00.134) 0:17:14.271 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "8fc43477-33fc-4cea-9240-68fd98da42e7" }, "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "size": "4G", "type": "crypt", "uuid": "44b886f9-1544-407f-aa7d-bb50dcfc9446" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": 
"250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:20:54 -0400 (0:00:01.139) 0:17:15.411 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003189", "end": "2026-04-20 05:20:55.204193", "rc": 0, "start": "2026-04-20 05:20:55.201004" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:20:55 -0400 (0:00:01.264) 0:17:16.676 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002742", "end": "2026-04-20 05:20:56.464619", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:20:56.461877" } STDOUT: luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:20:56 -0400 (0:00:01.225) 0:17:17.901 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for 
managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 05:20:57 -0400 (0:00:00.315) 0:17:18.217 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 05:20:57 -0400 (0:00:00.207) 0:17:18.424 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023082", "end": "2026-04-20 05:20:58.722511", "rc": 0, "start": "2026-04-20 05:20:58.699429" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 05:20:58 -0400 (0:00:01.640) 0:17:20.065 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 05:20:59 -0400 (0:00:00.297) 0:17:20.362 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 
20 April 2026 05:20:59 -0400 (0:00:00.267) 0:17:20.629 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 05:20:59 -0400 (0:00:00.318) 0:17:20.948 ********** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 05:21:01 -0400 (0:00:01.344) 0:17:22.292 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 05:21:01 -0400 (0:00:00.176) 0:17:22.469 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 05:21:01 -0400 (0:00:00.252) 0:17:22.722 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 05:21:01 -0400 (0:00:00.253) 0:17:22.975 ********** 
ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type - 2] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Monday 20 April 2026 05:21:02 -0400 (0:00:00.316) 0:17:23.291 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type - 3] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Monday 20 April 2026 05:21:02 -0400 (0:00:00.135) 0:17:23.427 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55
Monday 20 April 2026 05:21:02 -0400 (0:00:00.130) 0:17:23.557 **********
ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68
Monday 20 April 2026 05:21:02 -0400 (0:00:00.315) 0:17:23.873 **********
ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 }
STDERR: Shared connection to 10.31.14.110 closed.
MSG: non-zero return code

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78
Monday 20 April 2026 05:21:04 -0400 (0:00:01.283) 0:17:25.157 **********
skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }

TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88
Monday 20 April 2026 05:21:04 -0400 (0:00:00.147) 0:17:25.304 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Monday 20 April 2026 05:21:04 -0400 (0:00:00.300) 0:17:25.605 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Monday 20 April 2026 05:21:04 -0400 (0:00:00.148) 0:17:25.754 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Monday 20 April 2026 05:21:04 -0400 (0:00:00.239) 0:17:25.993 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Monday 20 April 2026 05:21:05 -0400 (0:00:00.241) 0:17:26.235 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Monday 20 April 2026 05:21:05 -0400 (0:00:00.168) 0:17:26.404 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Monday 20 April 2026 05:21:05 -0400 (0:00:00.248) 0:17:26.652 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Monday 20 April 2026 05:21:05 -0400 (0:00:00.312) 0:17:26.964 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Monday 20 April 2026 05:21:06 -0400 (0:00:00.273) 0:17:27.238 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Monday 20 April 2026 05:21:06 -0400 (0:00:00.294) 0:17:27.533 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Monday 20 April 2026 05:21:06 -0400 (0:00:00.144) 0:17:27.677 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Monday 20 April 2026 05:21:06 -0400 (0:00:00.124) 0:17:27.802 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91
Monday 20 April 2026 05:21:06 -0400 (0:00:00.150) 0:17:27.952 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Monday 20 April 2026 05:21:07 -0400 (0:00:00.281) 0:17:28.234 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Monday 20 April 2026 05:21:07 -0400 (0:00:00.396) 0:17:28.630 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Monday 20 April 2026 05:21:07 -0400 (0:00:00.348) 0:17:28.979 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Monday 20 April 2026 05:21:08 -0400 (0:00:00.236) 0:17:29.216 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Monday 20 April 2026 05:21:08 -0400 (0:00:00.253) 0:17:29.469 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Monday 20 April 2026 05:21:08 -0400 (0:00:00.198) 0:17:29.668 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Monday 20 April 2026 05:21:08 -0400 (0:00:00.283) 0:17:29.951 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Monday 20 April 2026 05:21:09 -0400 (0:00:00.332) 0:17:30.284 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94
Monday 20 April 2026 05:21:09 -0400 (0:00:00.196) 0:17:30.481 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Monday 20 April 2026 05:21:09 -0400 (0:00:00.235) 0:17:30.716 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Monday 20 April 2026 05:21:09 -0400 (0:00:00.227) 0:17:30.944 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Monday 20 April 2026 05:21:09 -0400 (0:00:00.078) 0:17:31.022 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Monday 20 April 2026 05:21:10 -0400 (0:00:00.231) 0:17:31.254 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Monday 20 April 2026 05:21:10 -0400 (0:00:00.220) 0:17:31.475 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97
Monday 20 April 2026 05:21:10 -0400 (0:00:00.144) 0:17:31.620 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2

TASK [Set test variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Monday 20 April 2026 05:21:11 -0400 (0:00:00.522) 0:17:32.142 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Monday 20 April 2026 05:21:11 -0400 (0:00:00.278) 0:17:32.420 **********
skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Monday 20 April 2026 05:21:11 -0400 (0:00:00.169) 0:17:32.590 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Monday 20 April 2026 05:21:11 -0400 (0:00:00.484) 0:17:33.074 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Monday 20 April 2026 05:21:12 -0400 (0:00:00.196) 0:17:33.271 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Monday 20 April 2026 05:21:12 -0400 (0:00:00.162) 0:17:33.434 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Monday 20 April 2026 05:21:12 -0400 (0:00:00.069) 0:17:33.503 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Monday 20 April 2026 05:21:12 -0400 (0:00:00.084) 0:17:33.587 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Monday 20 April 2026 05:21:12 -0400 (0:00:00.117) 0:17:33.705 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Monday 20 April 2026 05:21:12 -0400 (0:00:00.155) 0:17:33.861 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Monday 20 April 2026 05:21:12 -0400 (0:00:00.132) 0:17:33.994 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Monday 20 April 2026 05:21:13 -0400 (0:00:00.310) 0:17:34.304 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Monday 20 April 2026 05:21:13 -0400 (0:00:00.546) 0:17:34.851 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Monday 20 April 2026 05:21:14 -0400 (0:00:00.535) 0:17:35.386 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Monday 20 April 2026 05:21:14 -0400 (0:00:00.183) 0:17:35.570 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Monday 20 April 2026 05:21:14 -0400 (0:00:00.169) 0:17:35.740 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Monday 20 April 2026 05:21:14 -0400 (0:00:00.202) 0:17:35.942 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Monday 20 April 2026 05:21:14 -0400 (0:00:00.165) 0:17:36.107 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Monday 20 April 2026 05:21:15 -0400 (0:00:00.208) 0:17:36.316 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Monday 20 April 2026 05:21:15 -0400 (0:00:00.180) 0:17:36.497 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Monday 20 April 2026 05:21:15 -0400 (0:00:00.484) 0:17:36.981 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Monday 20 April 2026 05:21:16 -0400 (0:00:00.244) 0:17:37.226 **********
skipping: [managed-node2] => {}

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Monday 20 April 2026 05:21:16 -0400 (0:00:00.165) 0:17:37.392 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Monday 20 April 2026 05:21:16 -0400 (0:00:00.206) 0:17:37.599 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Monday 20 April 2026 05:21:16 -0400 (0:00:00.125) 0:17:37.724 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Monday 20 April 2026 05:21:16 -0400 (0:00:00.290) 0:17:38.015 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Monday 20 April 2026 05:21:17 -0400 (0:00:00.169) 0:17:38.185 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Monday 20 April 2026 05:21:17 -0400 (0:00:00.136) 0:17:38.388 **********
ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Monday 20 April 2026 05:21:17 -0400 (0:00:00.123) 0:17:38.512 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 20 April 2026 05:21:17 -0400 (0:00:00.383) 0:17:38.896 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 20 April 2026 05:21:17 -0400 (0:00:00.204) 0:17:39.100 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 20 April 2026 05:21:18 -0400 (0:00:00.668) 0:17:39.769 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 20 April 2026 05:21:18 -0400 (0:00:00.244) 0:17:40.013 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 20 April 2026 05:21:19 -0400 (0:00:00.311) 0:17:40.325 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 20 April 2026 05:21:19 -0400 (0:00:00.317) 0:17:40.642 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 20 April 2026 05:21:19 -0400 (0:00:00.212) 0:17:40.854 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 20 April 2026 05:21:19 -0400 (0:00:00.112) 0:17:40.967 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Monday 20 April 2026 05:21:20 -0400 (0:00:00.188) 0:17:41.156 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Monday 20 April 2026 05:21:20 -0400 (0:00:00.201) 0:17:41.357 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 20 April 2026 05:21:20 -0400 (0:00:00.190) 0:17:41.547 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 20 April 2026 05:21:20 -0400 (0:00:00.197) 0:17:41.744 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 20 April 2026 05:21:20 -0400 (0:00:00.202) 0:17:41.947 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 20 April 2026 05:21:21 -0400 (0:00:00.288) 0:17:42.236 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 20 April 2026 05:21:21 -0400 (0:00:00.522) 0:17:42.758 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 20 April 2026 05:21:21 -0400 (0:00:00.205) 0:17:42.963 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 20 April 2026 05:21:21 -0400 (0:00:00.146) 0:17:43.110 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026 05:21:22 -0400 (0:00:00.166) 0:17:43.276 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026 05:21:22 -0400 (0:00:00.175) 0:17:43.451 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026 05:21:22 -0400 (0:00:00.201) 0:17:43.653 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026 05:21:22 -0400 (0:00:00.311) 0:17:43.964 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026 05:21:23 -0400 (0:00:00.337) 0:17:44.302 **********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676785.387855, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676719.2250035, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 267962, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776676719.2250035, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026 05:21:24 -0400 (0:00:01.376) 0:17:45.679 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026 05:21:24 -0400 (0:00:00.365) 0:17:46.045 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026 05:21:25 -0400 (0:00:00.176) 0:17:46.221 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026 05:21:25 -0400 (0:00:00.197) 0:17:46.418 **********
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026 05:21:25 -0400 (0:00:00.198) 0:17:46.617 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:21:25 -0400 (0:00:00.198) 0:17:46.815 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:21:26 -0400 (0:00:00.344) 0:17:47.160 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676839.2277343, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776676719.3660033, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 268112, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776676719.3660033, "nlink": 1, "path": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:21:27 -0400 (0:00:01.141) 0:17:48.301 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:21:31 -0400 
(0:00:04.009) 0:17:52.310 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010145", "end": "2026-04-20 05:21:32.442026", "rc": 0, "start": "2026-04-20 05:21:32.431881" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 8fc43477-33fc-4cea-9240-68fd98da42e7 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 936262 Threads: 2 Salt: 91 a6 ed 84 f3 10 3d fc a6 bf 6b 6e 03 8f 80 a1 a1 4c 22 a5 64 c4 2e 36 49 e5 50 51 48 a8 3d 9e AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 8f 58 25 c2 e2 81 2e 63 fd bb 52 0e cc 82 33 a8 76 d4 0a 61 cf a1 71 81 20 71 d9 3f 2e 7d ea e8 Digest: 50 dd 78 e2 f3 01 5e 51 25 6e b6 3a 03 8f 32 7b 6f c4 38 d6 97 57 59 f2 15 46 0b db af 81 b6 6b TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:21:32 -0400 (0:00:01.534) 0:17:53.845 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:21:33 -0400 (0:00:00.347) 0:17:54.192 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:21:33 -0400 (0:00:00.313) 0:17:54.506 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 05:21:33 -0400 (0:00:00.298) 0:17:54.804 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:21:33 -0400 (0:00:00.248) 0:17:55.053 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:21:34 -0400 (0:00:00.328) 0:17:55.382 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:21:34 -0400 (0:00:00.272) 0:17:55.654 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:21:34 -0400 (0:00:00.266) 
0:17:55.921 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 05:21:35 -0400 (0:00:00.308) 0:17:56.229 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 05:21:35 -0400 (0:00:00.177) 0:17:56.407 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 05:21:35 -0400 (0:00:00.214) 0:17:56.621 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 05:21:35 -0400 (0:00:00.225) 0:17:56.847 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 05:21:35 -0400 (0:00:00.209) 0:17:57.057 ********** ok: 
[managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 05:21:36 -0400 (0:00:00.189) 0:17:57.246 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 05:21:36 -0400 (0:00:00.174) 0:17:57.421 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 05:21:36 -0400 (0:00:00.164) 0:17:57.586 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 05:21:36 -0400 (0:00:00.186) 0:17:57.773 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:21:36 -0400 (0:00:00.266) 0:17:58.040 ********** skipping: [managed-node2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:21:37 -0400 (0:00:00.211) 0:17:58.251 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:21:37 -0400 (0:00:00.246) 0:17:58.498 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:21:37 -0400 (0:00:00.211) 0:17:58.709 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:21:37 -0400 (0:00:00.181) 0:17:58.890 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:21:37 -0400 (0:00:00.191) 0:17:59.082 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] 
************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:21:38 -0400 (0:00:00.251) 0:17:59.333 ********** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 05:21:39 -0400 (0:00:01.539) 0:18:00.873 ********** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:21:40 -0400 (0:00:01.144) 0:18:02.017 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:21:41 -0400 (0:00:00.170) 0:18:02.188 ********** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:21:41 -0400 (0:00:00.127) 0:18:02.315 ********** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 05:21:42 -0400 (0:00:01.355) 0:18:03.670 ********** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 05:21:42 -0400 (0:00:00.204) 0:18:03.874 ********** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 05:21:42 -0400 (0:00:00.225) 0:18:04.100 ********** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 05:21:43 -0400 (0:00:00.204) 0:18:04.304 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 05:21:43 -0400 (0:00:00.198) 0:18:04.503 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 05:21:43 -0400 (0:00:00.296) 0:18:04.799 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved 
space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 05:21:43 -0400 (0:00:00.213) 0:18:05.013 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 05:21:44 -0400 (0:00:00.179) 0:18:05.192 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 05:21:44 -0400 (0:00:00.187) 0:18:05.380 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 05:21:44 -0400 (0:00:00.168) 0:18:05.548 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 05:21:44 -0400 (0:00:00.230) 0:18:05.779 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 05:21:44 -0400 (0:00:00.317) 0:18:06.097 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:21:45 -0400 (0:00:00.278) 0:18:06.375 ********** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 05:21:45 -0400 (0:00:00.247) 0:18:06.623 ********** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 05:21:45 -0400 (0:00:00.249) 0:18:06.872 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 05:21:45 -0400 (0:00:00.238) 0:18:07.110 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 05:21:46 -0400 (0:00:00.250) 0:18:07.361 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin 
pool volume size based on percentage value] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 05:21:46 -0400 (0:00:00.247) 0:18:07.609 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 05:21:46 -0400 (0:00:00.298) 0:18:07.907 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 05:21:46 -0400 (0:00:00.195) 0:18:08.103 ********** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 05:21:47 -0400 (0:00:00.195) 0:18:08.299 ********** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 05:21:47 -0400 (0:00:00.143) 0:18:08.443 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 05:21:47 -0400 (0:00:00.218) 0:18:08.662 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024713", "end": "2026-04-20 05:21:48.244274", "rc": 0, "start": "2026-04-20 05:21:48.219561" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 05:21:48 -0400 (0:00:01.052) 0:18:09.714 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 05:21:48 -0400 (0:00:00.252) 0:18:09.966 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 05:21:49 -0400 (0:00:00.263) 0:18:10.230 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 05:21:49 -0400 (0:00:00.319) 0:18:10.549 
********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 05:21:49 -0400 (0:00:00.247) 0:18:10.797 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 05:21:50 -0400 (0:00:00.385) 0:18:11.182 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 05:21:50 -0400 (0:00:00.256) 0:18:11.439 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 05:21:50 -0400 (0:00:00.221) 0:18:11.660 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 05:21:50 -0400 (0:00:00.166) 0:18:11.827 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] 
*********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 05:21:50 -0400 (0:00:00.164) 0:18:11.991 ********** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:427 Monday 20 April 2026 05:21:52 -0400 (0:00:01.479) 0:18:13.471 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 05:21:52 -0400 (0:00:00.370) 0:18:13.841 ********** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 05:21:52 -0400 (0:00:00.227) 0:18:14.069 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:21:53 -0400 (0:00:00.280) 
0:18:14.349 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:21:53 -0400 (0:00:00.186) 0:18:14.536 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:21:53 -0400 (0:00:00.235) 0:18:14.771 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:21:55 -0400 (0:00:01.752) 0:18:16.524 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:21:55 -0400 (0:00:00.314) 0:18:16.839 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:21:57 -0400 (0:00:01.699) 0:18:18.539 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:21:57 -0400 (0:00:00.423) 0:18:18.962 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:21:57 -0400 (0:00:00.074) 0:18:19.036 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:21:57 -0400 (0:00:00.057) 0:18:19.093 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:21:58 -0400 (0:00:00.121) 0:18:19.215 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:21:58 -0400 (0:00:00.098) 0:18:19.314 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:21:59 -0400 (0:00:01.154) 0:18:20.468 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:21:59 -0400 (0:00:00.277) 0:18:20.745 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:21:59 -0400 (0:00:00.208) 0:18:20.954 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:22:03 -0400 (0:00:03.919) 0:18:24.873 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:22:03 -0400 (0:00:00.109) 0:18:24.983 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:22:04 -0400 (0:00:00.226) 0:18:25.210 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:22:09 -0400 (0:00:05.611) 0:18:30.822 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK 
[fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:22:09 -0400 (0:00:00.249) 0:18:31.072 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:22:10 -0400 (0:00:00.119) 0:18:31.191 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:22:10 -0400 (0:00:00.226) 0:18:31.418 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:22:10 -0400 (0:00:00.126) 0:18:31.545 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:22:14 -0400 (0:00:04.095) 0:18:35.641 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", 
"status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": 
"dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": 
"halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": 
"selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service": { "name": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service": { "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:22:17 -0400 (0:00:03.036) 0:18:38.677 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8fc43477\x2d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => 
{ "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", 
"DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8fc43477-33fc-4cea-9240-68fd98da42e7 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": 
"infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", 
"RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:20:33 EDT", "StateChangeTimestampMonotonic": "2771568173", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", 
"SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:22:20 -0400 (0:00:03.274) 0:18:41.952 ********** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-8fc43477-33fc-4cea-9240-68fd98da42e7' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:22:26 -0400 (0:00:05.356) 0:18:47.308 ********** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-8fc43477-33fc-4cea-9240-68fd98da42e7' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': 
False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:22:26 -0400 (0:00:00.168) 0:18:47.476 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8fc43477\x2d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", 
"AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", 
"IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", 
"MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:20:33 EDT", "StateChangeTimestampMonotonic": "2771568173", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write 
cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": 
"0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", 
"SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:22:29 -0400 (0:00:03.217) 0:18:50.693 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:22:29 -0400 (0:00:00.252) 0:18:50.946 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 
05:22:30 -0400 (0:00:00.331) 0:18:51.278 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 05:22:30 -0400 (0:00:00.322) 0:18:51.601 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676912.1065707, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776676912.1065707, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776676912.1065707, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3074750467", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 05:22:31 -0400 (0:00:01.468) 0:18:53.069 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:451 Monday 20 April 2026 05:22:32 -0400 (0:00:00.195) 0:18:53.264 ********** included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:22:32 -0400 (0:00:00.494) 0:18:53.758 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:22:32 -0400 (0:00:00.180) 0:18:53.939 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:22:33 -0400 (0:00:00.284) 0:18:54.223 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:22:34 -0400 (0:00:01.746) 0:18:55.970 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:22:34 -0400 (0:00:00.155) 0:18:56.126 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:22:36 -0400 (0:00:01.751) 0:18:57.877 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:22:37 -0400 (0:00:00.540) 0:18:58.418 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:22:37 -0400 (0:00:00.266) 0:18:58.684 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:22:37 -0400 (0:00:00.211) 0:18:58.896 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:22:38 -0400 (0:00:00.300) 0:18:59.197 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:22:38 -0400 (0:00:00.208) 0:18:59.405 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:22:38 -0400 (0:00:00.486) 0:18:59.891 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] 
********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:22:38 -0400 (0:00:00.225) 0:19:00.116 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:22:39 -0400 (0:00:00.294) 0:19:00.410 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:22:43 -0400 (0:00:04.173) 0:19:04.584 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:22:43 -0400 (0:00:00.161) 0:19:04.746 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:22:43 -0400 (0:00:00.127) 0:19:04.873 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK 
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:22:48 -0400 (0:00:05.123) 0:19:09.997 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:22:49 -0400 (0:00:00.181) 0:19:10.178 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:22:49 -0400 (0:00:00.090) 0:19:10.269 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:22:49 -0400 (0:00:00.082) 0:19:10.351 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:22:49 -0400 (0:00:00.133) 0:19:10.485 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:22:53 -0400 (0:00:04.098) 0:19:14.584 ********** ok: [managed-node2] 
=> { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": 
"dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { 
"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": 
"microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": 
{ "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", 
"state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service": { "name": 
"systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service": { "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:22:55 -0400 (0:00:02.408) 0:19:16.993 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8fc43477\x2d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice 
cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8fc43477-33fc-4cea-9240-68fd98da42e7 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:20:33 EDT", "StateChangeTimestampMonotonic": "2771568173", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": 
"0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", 
"StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:22:58 -0400 (0:00:03.110) 0:19:20.103 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "absent" }, { "dump": 0, 
"fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:23:04 -0400 (0:00:05.892) 0:19:25.995 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:23:04 -0400 (0:00:00.098) 0:19:26.094 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676727.9599838, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "487c12cedec409e4b1c8a498737fe6376de786d0", "ctime": 1776676727.9569838, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776676727.9569838, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:23:05 -0400 (0:00:00.639) 0:19:26.733 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:23:06 -0400 
(0:00:01.326) 0:19:28.060 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8fc43477\x2d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", 
"Description": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", 
"LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", 
"StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:20:33 EDT", "StateChangeTimestampMonotonic": "2771568173", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", 
"CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": 
"infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": 
"no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:23:10 -0400 (0:00:03.722) 0:19:31.782 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:23:10 -0400 (0:00:00.327) 0:19:32.110 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": 
null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:23:11 -0400 (0:00:00.338) 0:19:32.448 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:23:11 -0400 (0:00:00.305) 0:19:32.754 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-8fc43477-33fc-4cea-9240-68fd98da42e7" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:23:13 -0400 (0:00:01.451) 0:19:34.205 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:23:14 -0400 (0:00:01.703) 0:19:35.909 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:23:16 -0400 (0:00:01.480) 0:19:37.389 ********** skipping: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": 
"Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:23:16 -0400 (0:00:00.588) 0:19:37.978 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:23:18 -0400 (0:00:01.754) 0:19:39.732 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676741.0269547, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "942626ec1389b6a662a9a1ced8972a091329baff", "ctime": 1776676734.0559702, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 209715395, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776676734.0559702, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1064360692", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:23:20 -0400 (0:00:01.575) 0:19:41.308 ********** changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-8fc43477-33fc-4cea-9240-68fd98da42e7', 'password': '-', 'state': 
'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 05:23:22 -0400 (0:00:02.342) 0:19:43.650 ********** ok: [managed-node2] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:467 Monday 20 April 2026 05:23:24 -0400 (0:00:02.170) 0:19:45.821 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:23:25 -0400 (0:00:00.400) 0:19:46.222 ********** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:23:25 -0400 (0:00:00.254) 0:19:46.476 ********** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:23:25 -0400 (0:00:00.163) 0:19:46.639 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "e229ca45-29e3-47f2-aa0c-3c1f85d57cf7" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume 
existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:23:27 -0400 (0:00:01.844) 0:19:48.484 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002617", "end": "2026-04-20 05:23:28.779322", "rc": 0, "start": "2026-04-20 05:23:28.776705" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 
April 2026 05:23:29 -0400 (0:00:01.722) 0:19:50.206 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002599", "end": "2026-04-20 05:23:30.644475", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:23:30.641876" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:23:30 -0400 (0:00:01.799) 0:19:52.006 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 05:23:31 -0400 (0:00:00.336) 0:19:52.342 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 05:23:31 -0400 (0:00:00.195) 0:19:52.538 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022880", "end": "2026-04-20 05:23:32.592875", "rc": 0, "start": "2026-04-20 05:23:32.569995" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 05:23:32 -0400 (0:00:01.495) 0:19:54.033 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** 
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 05:23:33 -0400 (0:00:00.361) 0:19:54.395 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 05:23:33 -0400 (0:00:00.639) 0:19:55.034 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 05:23:34 -0400 (0:00:00.364) 0:19:55.399 ********** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 05:23:35 -0400 (0:00:01.691) 0:19:57.090 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 05:23:36 -0400 (0:00:00.264) 0:19:57.355 ********** ok: [managed-node2] => { "ansible_facts": { 
"_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 05:23:36 -0400 (0:00:00.254) 0:19:57.609 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 05:23:36 -0400 (0:00:00.379) 0:19:57.988 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 05:23:37 -0400 (0:00:00.310) 0:19:58.299 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 05:23:37 -0400 (0:00:00.303) 0:19:58.603 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 05:23:37 -0400 (0:00:00.264) 0:19:58.867 ********** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet 
supports PV grow to fill] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 05:23:38 -0400 (0:00:00.497) 0:19:59.365 ********** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.110 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 05:23:39 -0400 (0:00:01.695) 0:20:01.061 ********** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 05:23:40 -0400 (0:00:00.246) 0:20:01.307 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 05:23:40 -0400 (0:00:00.557) 0:20:01.865 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 05:23:40 -0400 (0:00:00.227) 0:20:02.092 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" 
}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Monday 20 April 2026  05:23:41 -0400 (0:00:00.239)       0:20:02.332 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Monday 20 April 2026  05:23:41 -0400 (0:00:00.272)       0:20:02.604 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Monday 20 April 2026  05:23:41 -0400 (0:00:00.271)       0:20:02.876 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Monday 20 April 2026  05:23:41 -0400 (0:00:00.218)       0:20:03.095 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Monday 20 April 2026  05:23:42 -0400 (0:00:00.219)       0:20:03.315 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Monday 20 April 2026  05:23:42 -0400 (0:00:00.208)       0:20:03.523 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Monday 20 April 2026  05:23:42 -0400 (0:00:00.185)       0:20:03.708 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Monday 20 April 2026  05:23:42 -0400 (0:00:00.202)       0:20:03.911 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Monday 20 April 2026  05:23:42 -0400 (0:00:00.163)       0:20:04.074 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_chunk_size_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91
Monday 20 April 2026  05:23:43 -0400 (0:00:00.162)       0:20:04.237 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Monday 20 April 2026  05:23:43 -0400 (0:00:00.386)       0:20:04.624 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Monday 20 April 2026  05:23:43 -0400 (0:00:00.329)       0:20:04.953 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Monday 20 April 2026  05:23:44 -0400 (0:00:00.192)       0:20:05.146 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check segment type] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Monday 20 April 2026  05:23:44 -0400 (0:00:00.257)       0:20:05.404 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Monday 20 April 2026  05:23:44 -0400 (0:00:00.263)       0:20:05.668 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Monday 20 April 2026  05:23:44 -0400 (0:00:00.219)       0:20:05.888 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Monday 20 April 2026  05:23:44 -0400 (0:00:00.246)       0:20:06.135 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Monday 20 April 2026  05:23:45 -0400 (0:00:00.218)       0:20:06.353 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94
Monday 20 April 2026  05:23:45 -0400 (0:00:00.315)       0:20:06.668 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Monday 20 April 2026  05:23:45 -0400 (0:00:00.411)       0:20:07.080 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Monday 20 April 2026  05:23:46 -0400 (0:00:00.309)       0:20:07.389 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Monday 20 April 2026  05:23:46 -0400 (0:00:00.331)       0:20:07.721 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Monday 20 April 2026  05:23:46 -0400 (0:00:00.109)       0:20:07.830 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Monday 20 April 2026  05:23:46 -0400 (0:00:00.221)       0:20:08.052 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_thin_status": null
    },
    "changed": false
}

TASK [Check member encryption] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97
Monday 20 April 2026  05:23:47 -0400 (0:00:00.255)       0:20:08.307 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2

TASK [Set test variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Monday 20 April 2026  05:23:47 -0400 (0:00:00.407)       0:20:08.714 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Monday 20 April 2026  05:23:47 -0400 (0:00:00.236)       0:20:08.951 **********
skipping: [managed-node2] => (item=/dev/sda)  => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Monday 20 April 2026  05:23:48 -0400 (0:00:00.243)       0:20:09.194 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Monday 20 April 2026  05:23:48 -0400 (0:00:00.361)       0:20:09.556 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Monday 20 April 2026  05:23:48 -0400 (0:00:00.157)       0:20:09.714 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Monday 20 April 2026  05:23:48 -0400 (0:00:00.208)       0:20:09.922 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Monday 20 April 2026  05:23:49 -0400 (0:00:00.242)       0:20:10.165 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Monday 20 April 2026  05:23:49 -0400 (0:00:00.249)       0:20:10.414 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Monday 20 April 2026  05:23:49 -0400 (0:00:00.239)       0:20:10.654 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Monday 20 April 2026  05:23:49 -0400 (0:00:00.163)       0:20:10.817 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Monday 20 April 2026  05:23:49 -0400 (0:00:00.175)       0:20:10.993 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Monday 20 April 2026  05:23:50 -0400 (0:00:00.511)       0:20:11.504 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Monday 20 April 2026  05:23:50 -0400 (0:00:00.329)       0:20:11.833 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Monday 20 April 2026  05:23:50 -0400 (0:00:00.186)       0:20:12.020 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Monday 20 April 2026  05:23:51 -0400 (0:00:00.211)       0:20:12.231 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Monday 20 April 2026  05:23:51 -0400 (0:00:00.297)       0:20:12.529 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Monday 20 April 2026  05:23:51 -0400 (0:00:00.320)       0:20:12.849 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Monday 20 April 2026  05:23:51 -0400 (0:00:00.204)       0:20:13.054 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Monday 20 April 2026  05:23:52 -0400 (0:00:00.326)       0:20:13.381 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Monday 20 April 2026  05:23:52 -0400 (0:00:00.214)       0:20:13.596 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Monday 20 April 2026  05:23:52 -0400 (0:00:00.517)       0:20:14.113 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Print script output] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Monday 20 April 2026  05:23:53 -0400 (0:00:00.200)       0:20:14.313 **********
skipping: [managed-node2] => {}

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Monday 20 April 2026  05:23:53 -0400 (0:00:00.250)       0:20:14.564 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Monday 20 April 2026  05:23:53 -0400 (0:00:00.200)       0:20:14.765 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Monday 20 April 2026  05:23:53 -0400 (0:00:00.258)       0:20:15.023 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Monday 20 April 2026  05:23:54 -0400 (0:00:00.226)       0:20:15.249 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Monday 20 April 2026  05:23:54 -0400 (0:00:00.193)       0:20:15.443 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_stratis_report": null
    },
    "changed": false
}

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Monday 20 April 2026  05:23:54 -0400 (0:00:00.183)       0:20:15.626 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Monday 20 April 2026  05:23:54 -0400 (0:00:00.279)       0:20:15.906 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 20 April 2026  05:23:55 -0400 (0:00:00.223)       0:20:16.333 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 20 April 2026  05:23:55 -0400 (0:00:00.427)       0:20:16.557 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 20 April 2026  05:23:56 -0400 (0:00:01.385)       0:20:17.942 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 20 April 2026  05:23:57 -0400 (0:00:00.630)       0:20:18.573 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_mount_expected_mount_point": "/opt/test1",
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 20 April 2026  05:23:58 -0400 (0:00:00.728)       0:20:19.301 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 20 April 2026  05:23:58 -0400 (0:00:00.386)       0:20:19.688 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 20 April 2026  05:23:58 -0400 (0:00:00.262)       0:20:19.950 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 20 April 2026  05:23:59 -0400 (0:00:00.392)       0:20:20.342 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Monday 20 April 2026  05:23:59 -0400 (0:00:00.333)       0:20:20.676 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Monday 20 April 2026  05:23:59 -0400 (0:00:00.298)       0:20:20.974 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 20 April 2026  05:24:00 -0400 (0:00:00.295)       0:20:21.270 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 20 April 2026  05:24:00 -0400 (0:00:00.288)       0:20:21.559 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 20 April 2026  05:24:00 -0400 (0:00:00.358)       0:20:21.917 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_found_mount_stat": null,
        "storage_test_mount_expected_mount_point": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 20 April 2026  05:24:00 -0400 (0:00:00.146)       0:20:22.063 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 20 April 2026  05:24:01 -0400 (0:00:00.333)       0:20:22.396 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 20 April 2026  05:24:01 -0400 (0:00:00.246)       0:20:22.643 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 20 April 2026  05:24:01 -0400 (0:00:00.224)       0:20:22.868 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026  05:24:01 -0400 (0:00:00.224)       0:20:23.092 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026  05:24:02 -0400 (0:00:00.196)       0:20:23.289 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026  05:24:02 -0400 (0:00:00.166)       0:20:23.456 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026  05:24:02 -0400 (0:00:00.212)       0:20:23.668 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026  05:24:02 -0400 (0:00:00.179)       0:20:23.848 **********
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "atime": 1776676984.6384082,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1776676984.6384082,
        "dev": 6,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 300544,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1776676984.6384082,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026  05:24:03 -0400 (0:00:00.919)       0:20:24.767 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026  05:24:03 -0400 (0:00:00.089)       0:20:24.857 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026  05:24:03 -0400 (0:00:00.096)       0:20:24.953 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026  05:24:04 -0400 (0:00:00.292)       0:20:25.246 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026  05:24:04 -0400 (0:00:00.215)       0:20:25.461 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026  05:24:04 -0400 (0:00:00.180)       0:20:25.642 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026  05:24:04 -0400 (0:00:00.149)       0:20:25.792 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026  05:24:04 -0400 (0:00:00.243)       0:20:26.036 **********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026  05:24:09 -0400 (0:00:04.437)       0:20:30.474 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 20 April 2026  05:24:09 -0400 (0:00:00.085)       0:20:30.559 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 20 April 2026  05:24:09 -0400 (0:00:00.152)       0:20:30.712 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 20 April 2026  05:24:09 -0400 (0:00:00.151)       0:20:30.863 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026  05:24:09 -0400 (0:00:00.161)       0:20:31.024 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026  05:24:10 -0400 (0:00:00.181)       0:20:31.205 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026  05:24:10 -0400 (0:00:00.073)       0:20:31.279 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026  05:24:10 -0400 (0:00:00.173)       0:20:31.452 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set test variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026  05:24:10 -0400 (0:00:00.208)       0:20:31.661 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026  05:24:10 -0400 (0:00:00.195)       0:20:31.856 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026  05:24:10 -0400 (0:00:00.135)       0:20:31.991 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026  05:24:10 -0400 (0:00:00.121)       0:20:32.113 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026  05:24:11 -0400 (0:00:00.110)       0:20:32.224 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026  05:24:11 -0400 (0:00:00.070)       0:20:32.294 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026  05:24:11 -0400 (0:00:00.110)       0:20:32.405 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026  05:24:11 -0400 (0:00:00.119)       0:20:32.525 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026  05:24:11 -0400 (0:00:00.110)       0:20:32.635 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026  05:24:11 -0400
(0:00:00.167) 0:20:32.803 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 05:24:11 -0400 (0:00:00.115) 0:20:32.919 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 05:24:11 -0400 (0:00:00.165) 0:20:33.084 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 05:24:12 -0400 (0:00:00.146) 0:20:33.231 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 05:24:12 -0400 (0:00:00.119) 0:20:33.351 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 05:24:12 -0400 (0:00:00.091) 0:20:33.442 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 05:24:12 -0400 (0:00:00.219) 0:20:33.662 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 05:24:12 -0400 (0:00:00.112) 0:20:33.774 ********** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 05:24:13 -0400 (0:00:01.150) 0:20:34.925 ********** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 05:24:14 -0400 (0:00:01.181) 0:20:36.106 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 05:24:15 -0400 (0:00:00.145) 0:20:36.251 ********** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 05:24:15 -0400 (0:00:00.133) 0:20:36.384 ********** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 05:24:16 -0400 (0:00:01.248) 0:20:37.633 ********** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 05:24:16 -0400 (0:00:00.135) 0:20:37.768 ********** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 05:24:16 -0400 (0:00:00.152) 0:20:37.920 ********** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 05:24:16 -0400 (0:00:00.117) 0:20:38.038 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 05:24:17 -0400 (0:00:00.144) 0:20:38.182 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal 
thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 05:24:17 -0400 (0:00:00.169) 0:20:38.351 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 05:24:17 -0400 (0:00:00.100) 0:20:38.451 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 05:24:17 -0400 (0:00:00.098) 0:20:38.550 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 05:24:17 -0400 (0:00:00.171) 0:20:38.721 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 05:24:17 -0400 (0:00:00.162) 0:20:38.884 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 05:24:17 -0400 (0:00:00.073) 0:20:38.957 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 05:24:17 -0400 (0:00:00.065) 0:20:39.022 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:24:17 -0400 (0:00:00.061) 0:20:39.084 ********** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 05:24:18 -0400 (0:00:00.066) 0:20:39.150 ********** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 05:24:18 -0400 (0:00:00.085) 0:20:39.236 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 05:24:18 -0400 (0:00:00.151) 0:20:39.388 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for 
expected thin pool volume size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 05:24:18 -0400 (0:00:00.146) 0:20:39.534 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 05:24:18 -0400 (0:00:00.163) 0:20:39.698 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 05:24:18 -0400 (0:00:00.091) 0:20:39.789 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 05:24:18 -0400 (0:00:00.134) 0:20:39.923 ********** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 05:24:18 -0400 (0:00:00.107) 0:20:40.031 ********** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 05:24:18 -0400 (0:00:00.074) 0:20:40.105 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 05:24:19 -0400 (0:00:00.117) 0:20:40.222 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.027117", "end": "2026-04-20 05:24:19.924027", "rc": 0, "start": "2026-04-20 05:24:19.896910" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 05:24:20 -0400 (0:00:00.994) 0:20:41.217 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 05:24:20 -0400 (0:00:00.279) 0:20:41.496 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 05:24:20 -0400 (0:00:00.080) 0:20:41.577 ********** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 05:24:20 -0400 (0:00:00.094) 0:20:41.672 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 05:24:20 -0400 (0:00:00.142) 0:20:41.815 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 05:24:20 -0400 (0:00:00.142) 0:20:41.957 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 05:24:20 -0400 (0:00:00.152) 0:20:42.110 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 05:24:21 -0400 (0:00:00.115) 0:20:42.225 ********** TASK [Clean up variable namespace] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 05:24:21 -0400 (0:00:00.205) 0:20:42.431 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 05:24:21 -0400 (0:00:00.185) 0:20:42.616 ********** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:473 Monday 20 April 2026 05:24:22 -0400 (0:00:01.185) 0:20:43.802 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 05:24:23 -0400 (0:00:00.442) 0:20:44.244 ********** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 05:24:23 -0400 (0:00:00.138) 0:20:44.383 ********** included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 05:24:23 -0400 (0:00:00.271) 0:20:44.654 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 05:24:23 -0400 (0:00:00.217) 0:20:44.872 ********** TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 05:24:23 -0400 (0:00:00.117) 0:20:44.989 ********** ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6 Monday 20 April 2026 05:24:25 -0400 (0:00:01.316) 0:20:46.306 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 05:24:25 -0400 (0:00:00.192) 0:20:46.498 ********** ok: [managed-node2] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 05:24:27 -0400 (0:00:01.693) 0:20:48.191 ********** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:24:27 -0400 (0:00:00.235) 0:20:48.427 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 20 April 2026 05:24:27 -0400 (0:00:00.075) 0:20:48.503 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 05:24:27 -0400 (0:00:00.069) 0:20:48.572 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 05:24:27 -0400 (0:00:00.127) 0:20:48.699 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 Monday 20 April 2026 05:24:27 -0400 (0:00:00.114) 0:20:48.814 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 05:24:28 -0400 (0:00:00.329) 0:20:49.144 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] 
********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 05:24:28 -0400 (0:00:00.125) 0:20:49.269 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:24:28 -0400 (0:00:00.127) 0:20:49.397 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:24:31 -0400 (0:00:03.436) 0:20:52.833 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:24:31 -0400 (0:00:00.187) 0:20:53.021 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:24:31 -0400 (0:00:00.109) 0:20:53.130 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK 
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:24:37 -0400 (0:00:05.100) 0:20:58.231 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:24:37 -0400 (0:00:00.213) 0:20:58.444 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:24:37 -0400 (0:00:00.140) 0:20:58.584 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:24:37 -0400 (0:00:00.125) 0:20:58.710 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:24:37 -0400 (0:00:00.097) 0:20:58.807 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:24:41 -0400 (0:00:03.783) 0:21:02.591 ********** ok: [managed-node2] 
=> {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
            "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
            "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
            "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" },
            "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
            "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
            "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
            "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
            "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" },
            "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" },
            "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" },
            "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
            "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
            "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
            "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
            "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" },
            "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
            "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
            "getty@tty1.service": {
"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": 
"microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": 
{ "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", 
"state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service": { "name": 
"systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service": { "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:24:43 -0400 (0:00:02.517) 0:21:05.109 ********** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8fc43477\x2d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice 
cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-8fc43477-33fc-4cea-9240-68fd98da42e7", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-8fc43477-33fc-4cea-9240-68fd98da42e7 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-8fc43477-33fc-4cea-9240-68fd98da42e7 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", 
"OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 05:20:33 EDT", "StateChangeTimestampMonotonic": "2771568173", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": 
"init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", 
"DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not 
set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:24:46 -0400 (0:00:02.929) 0:21:08.038 ********** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 05:24:51 -0400 (0:00:05.072) 0:21:13.111 ********** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133
Monday 20 April 2026 05:24:52 -0400 (0:00:00.167) 0:21:13.279 **********
changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d8fc43477\x2d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs":
"", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", 
"IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", 
"MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d8fc43477\\x2d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", 
"TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d33fc\x2d4cea\x2d9240\x2d68fd98da42e7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "name": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d33fc\\x2d4cea\\x2d9240\\x2d68fd98da42e7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", 
"StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 05:24:54 -0400 (0:00:02.356) 0:21:15.635 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 05:24:54 -0400 (0:00:00.125) 0:21:15.760 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 05:24:54 -0400 (0:00:00.184) 0:21:15.945 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] 
***********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Monday 20 April 2026 05:24:55 -0400 (0:00:00.259) 0:21:16.204 **********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776677062.4852335, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776677062.4852335, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776677062.4852335, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3521963196", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Assert file presence] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Monday 20 April 2026 05:24:56 -0400 (0:00:01.182) 0:21:17.386 **********
ok: [managed-node2] => { "changed": false }

MSG: All assertions passed

TASK [Add encryption to the volume - 3] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:497
Monday 20 April 2026 05:24:56 -0400 (0:00:00.092) 0:21:17.479 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2
META: facts cleared

TASK [Run the role] ************************************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 05:24:56 -0400 (0:00:00.209) 0:21:17.688 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 05:24:56 -0400 (0:00:00.113) 0:21:17.801 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 05:24:56 -0400 (0:00:00.155) 0:21:17.957 **********
ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Monday 20 April 2026 05:24:57 -0400 (0:00:00.971) 0:21:18.928 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 05:24:57 -0400 (0:00:00.108) 0:21:19.037 **********
ok: [managed-node2]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 05:24:59 -0400 (0:00:01.431) 0:21:20.468 **********
skipping: [managed-node2] => (item=RedHat.yml) => {
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Monday 20 April 2026 05:24:59 -0400 (0:00:00.519) 0:21:20.988 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Monday 
20 April 2026 05:24:59 -0400 (0:00:00.138) 0:21:21.127 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:25:00 -0400 (0:00:00.137) 0:21:21.264 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:25:00 -0400 (0:00:00.511) 0:21:21.776 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:25:00 -0400 (0:00:00.074) 0:21:21.851 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:25:00 -0400 (0:00:00.253) 0:21:22.104 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:25:01 -0400 (0:00:00.062) 0:21:22.167 **********
skipping: [managed-node2] => { "changed":
false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 05:25:01 -0400 (0:00:00.040) 0:21:22.208 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 05:25:04 -0400 (0:00:03.530) 0:21:25.738 ********** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 05:25:04 -0400 (0:00:00.080) 0:21:25.819 ********** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 05:25:04 -0400 (0:00:00.077) 0:21:25.896 ********** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 05:25:09 -0400 (0:00:04.849) 0:21:30.745 
********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 05:25:09 -0400 (0:00:00.143) 0:21:30.889 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 05:25:09 -0400 (0:00:00.108) 0:21:30.997 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 05:25:09 -0400 (0:00:00.101) 0:21:31.099 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 05:25:10 -0400 (0:00:00.056) 0:21:31.155 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 05:25:13 -0400 (0:00:03.406) 0:21:34.562 ********** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { 
"name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:25:15 -0400 (0:00:02.459) 0:21:37.021 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:25:16 -0400 (0:00:00.233) 0:21:37.255 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": 
"xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:25:29 -0400 (0:00:13.309) 0:21:50.577 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:25:29 -0400 (0:00:00.144) 0:21:50.722 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776676995.9233828, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1776676995.9203827, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"text/plain", "mode": "0644", "mtime": 1776676995.9203827, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:25:30 -0400 (0:00:01.411) 0:21:52.133 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:25:32 -0400 (0:00:01.455) 0:21:53.589 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:25:32 -0400 (0:00:00.347) 0:21:53.936 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", 
"/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": 
null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:25:32 -0400 (0:00:00.203) 0:21:54.139 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:25:33 -0400 (0:00:00.213) 0:21:54.353 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:25:33 -0400 (0:00:00.276) 0:21:54.630 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:25:35 -0400 (0:00:01.648) 0:21:56.278 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 05:25:36 -0400 
(0:00:01.691) 0:21:57.969 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 05:25:38 -0400 (0:00:01.420) 0:21:59.390 ********** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 05:25:38 -0400 (0:00:00.248) 0:21:59.638 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : 
Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 05:25:39 -0400 (0:00:01.416) 0:22:01.055 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776677010.6433496, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776677001.4023705, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 8388807, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776677001.4013705, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "887266473", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 05:25:41 -0400 (0:00:01.223) 0:22:02.278 ********** changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-1141d0ba-5865-4d42-80f2-7df886666bbe', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 
Monday 20 April 2026 05:25:42 -0400 (0:00:01.393) 0:22:03.672 ********** ok: [managed-node2] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:513 Monday 20 April 2026 05:25:44 -0400 (0:00:01.536) 0:22:05.209 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 05:25:44 -0400 (0:00:00.213) 0:22:05.422 ********** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 05:25:44 -0400 (0:00:00.144) 0:22:05.567 ********** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 05:25:44 -0400 (0:00:00.115) 0:22:05.683 ********** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "1141d0ba-5865-4d42-80f2-7df886666bbe" }, "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "size": "4G", "type": "crypt", "uuid": "3dbf990b-2a35-4d1c-a794-f6b84b862359" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": 
"/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 05:25:45 -0400 (0:00:01.220) 0:22:06.903 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002611", "end": "2026-04-20 05:25:46.777564", "rc": 0, "start": "2026-04-20 05:25:46.774953" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 05:25:47 -0400 (0:00:01.240) 0:22:08.144 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003096", "end": "2026-04-20 05:25:48.086345", "failed_when_result": false, "rc": 0, "start": "2026-04-20 05:25:48.083249" } STDOUT: luks-1141d0ba-5865-4d42-80f2-7df886666bbe /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 05:25:48 -0400 (0:00:01.343) 0:22:09.488 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for 
managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 05:25:48 -0400 (0:00:00.280) 0:22:09.768 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 05:25:48 -0400 (0:00:00.119) 0:22:09.888 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023934", "end": "2026-04-20 05:25:49.889583", "rc": 0, "start": "2026-04-20 05:25:49.865649" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 05:25:49 -0400 (0:00:01.250) 0:22:11.139 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 05:25:50 -0400 (0:00:00.172) 0:22:11.311 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 
20 April 2026 05:25:50 -0400 (0:00:00.276) 0:22:11.588 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 05:25:50 -0400 (0:00:00.151) 0:22:11.739 ********** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 05:25:51 -0400 (0:00:01.270) 0:22:13.010 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 05:25:51 -0400 (0:00:00.113) 0:22:13.123 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 05:25:52 -0400 (0:00:00.161) 0:22:13.284 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 05:25:52 -0400 (0:00:00.206) 0:22:13.491 ********** 
ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 05:25:52 -0400 (0:00:00.159) 0:22:13.650 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 05:25:52 -0400 (0:00:00.225) 0:22:13.876 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 05:25:52 -0400 (0:00:00.201) 0:22:14.078 ********** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 05:25:53 -0400 (0:00:00.306) 0:22:14.385 ********** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.110 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 05:25:54 -0400 (0:00:01.242) 0:22:15.627 ********** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 05:25:54 -0400 (0:00:00.111) 0:22:15.739 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 05:25:55 -0400 (0:00:00.407) 0:22:16.146 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 05:25:55 -0400 (0:00:00.213) 0:22:16.360 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 05:25:55 -0400 (0:00:00.257) 0:22:16.617 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md 
version regex] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 05:25:55 -0400 (0:00:00.231) 0:22:16.850 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 05:25:55 -0400 (0:00:00.120) 0:22:16.970 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 05:25:56 -0400 (0:00:00.239) 0:22:17.209 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 05:25:56 -0400 (0:00:00.191) 0:22:17.401 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 05:25:56 -0400 (0:00:00.123) 0:22:17.525 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 05:25:56 -0400 (0:00:00.150) 0:22:17.675 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 05:25:56 -0400 (0:00:00.179) 0:22:17.854 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 05:25:56 -0400 (0:00:00.202) 0:22:18.057 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 05:25:57 -0400 (0:00:00.161) 0:22:18.219 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 05:25:57 -0400 (0:00:00.344) 0:22:18.563 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get 
information about the LV] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 05:25:57 -0400 (0:00:00.236) 0:22:18.800 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 05:25:57 -0400 (0:00:00.134) 0:22:18.935 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 05:25:57 -0400 (0:00:00.199) 0:22:19.134 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 05:25:58 -0400 (0:00:00.248) 0:22:19.383 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 05:25:58 -0400 (0:00:00.214) 0:22:19.597 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 05:25:58 -0400 (0:00:00.310) 0:22:19.907 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 05:25:59 -0400 (0:00:00.297) 0:22:20.205 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 05:25:59 -0400 (0:00:00.339) 0:22:20.545 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 05:25:59 -0400 (0:00:00.534) 0:22:21.080 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 05:26:00 -0400 (0:00:00.211) 0:22:21.292 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 05:26:00 -0400 (0:00:00.183) 0:22:21.475 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 05:26:00 -0400 (0:00:00.267) 0:22:21.743 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 05:26:00 -0400 (0:00:00.253) 0:22:21.997 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 05:26:01 -0400 (0:00:00.311) 0:22:22.309 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 05:26:01 -0400 (0:00:00.410) 0:22:22.719 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 05:26:02 -0400 (0:00:00.798) 0:22:23.517 ********** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 05:26:02 -0400 (0:00:00.258) 0:22:23.775 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 05:26:02 -0400 (0:00:00.331) 0:22:24.107 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 05:26:03 -0400 (0:00:00.286) 0:22:24.393 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 05:26:03 -0400 (0:00:00.237) 0:22:24.631 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] 
********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 05:26:03 -0400 (0:00:00.223) 0:22:24.855 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 05:26:03 -0400 (0:00:00.260) 0:22:25.116 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 05:26:04 -0400 (0:00:00.246) 0:22:25.362 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 05:26:04 -0400 (0:00:00.119) 0:22:25.482 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 05:26:04 -0400 (0:00:00.089) 0:22:25.571 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] 
*************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 05:26:04 -0400 (0:00:00.438) 0:22:26.009 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 05:26:05 -0400 (0:00:00.298) 0:22:26.308 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 05:26:05 -0400 (0:00:00.258) 0:22:26.566 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 05:26:05 -0400 (0:00:00.210) 0:22:26.777 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 05:26:05 -0400 (0:00:00.245) 0:22:27.022 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 05:26:05 -0400 (0:00:00.103) 0:22:27.126 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 05:26:06 -0400 (0:00:00.301) 0:22:27.428 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 05:26:06 -0400 (0:00:00.209) 0:22:27.638 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 05:26:06 -0400 (0:00:00.209) 0:22:27.847 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 05:26:07 -0400 (0:00:00.448) 0:22:28.296 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 05:26:07 -0400 (0:00:00.246) 0:22:28.542 ********** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 05:26:07 -0400 (0:00:00.191) 0:22:28.733 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 05:26:07 -0400 (0:00:00.216) 0:22:28.950 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 05:26:08 -0400 (0:00:00.256) 0:22:29.207 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 05:26:08 -0400 (0:00:00.185) 0:22:29.392 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 05:26:08 -0400 (0:00:00.195) 0:22:29.587 ********** ok: [managed-node2] => { 
"ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 05:26:08 -0400 (0:00:00.178) 0:22:29.766 ********** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 05:26:08 -0400 (0:00:00.223) 0:22:29.990 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 05:26:09 -0400 (0:00:00.302) 0:22:30.292 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 05:26:09 -0400 (0:00:00.344) 0:22:30.637 ********** included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 05:26:10 -0400 (0:00:01.041) 0:22:31.678 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 05:26:10 -0400 (0:00:00.127) 0:22:31.806 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 05:26:11 -0400 (0:00:00.346) 0:22:32.153 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 05:26:11 -0400 (0:00:00.206) 0:22:32.359 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 05:26:11 -0400 (0:00:00.125) 0:22:32.484 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 05:26:11 -0400 (0:00:00.108) 0:22:32.593 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 05:26:11 -0400 (0:00:00.178) 0:22:32.771 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 05:26:11 -0400 
(0:00:00.170) 0:22:32.942 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 05:26:11 -0400 (0:00:00.157) 0:22:33.099 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 05:26:12 -0400 (0:00:00.320) 0:22:33.420 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 05:26:12 -0400 (0:00:00.189) 0:22:33.610 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 05:26:12 -0400 (0:00:00.120) 0:22:33.731 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe " ], 
"storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 05:26:12 -0400 (0:00:00.288) 0:22:34.020 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 05:26:13 -0400 (0:00:00.134) 0:22:34.154 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 05:26:13 -0400 (0:00:00.211) 0:22:34.366 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 05:26:13 -0400 (0:00:00.163) 0:22:34.529 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 05:26:13 -0400 (0:00:00.257) 0:22:34.787 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, 
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 05:26:13 -0400 (0:00:00.167) 0:22:34.954 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 05:26:14 -0400 (0:00:00.346) 0:22:35.301 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 05:26:14 -0400 (0:00:00.290) 0:22:35.592 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776677128.9790843, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776677128.9790843, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 300544, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776677128.9790843, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": 
true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 05:26:15 -0400 (0:00:01.498) 0:22:37.090 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 05:26:16 -0400 (0:00:00.213) 0:22:37.304 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 05:26:16 -0400 (0:00:00.308) 0:22:37.613 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 05:26:16 -0400 (0:00:00.182) 0:22:37.796 ********** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 05:26:16 -0400 (0:00:00.285) 0:22:38.081 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 05:26:17 -0400 (0:00:00.216) 0:22:38.298 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 05:26:17 -0400 (0:00:00.147) 0:22:38.446 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776677129.124084, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776677129.124084, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 319703, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776677129.124084, "nlink": 1, "path": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 05:26:18 -0400 (0:00:01.268) 0:22:39.715 ********** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 05:26:22 -0400 
(0:00:04.218) 0:22:43.933 ********** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010532", "end": "2026-04-20 05:26:23.987468", "rc": 0, "start": "2026-04-20 05:26:23.976936" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 1141d0ba-5865-4d42-80f2-7df886666bbe Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 919618 Threads: 2 Salt: cb 69 ff b3 2b 44 8d a6 b4 a2 6e 8d a4 a5 86 b0 a2 e7 ef e9 78 5f cd 93 85 f8 2e 04 40 92 32 4d AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 6d ed 97 5f 89 a1 b1 dc ae f4 fd 0d f6 9e 86 37 8c aa 42 c2 5f 14 89 17 06 a6 82 94 f6 de 1b fc Digest: 0f f8 6e 72 f1 1d 4c a3 1b 01 8d 3c bf 23 60 b2 87 91 2d 39 ae 22 fc 00 ab da 4e 69 d2 43 34 22 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 05:26:24 -0400 (0:00:01.482) 0:22:45.416 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 05:26:24 -0400 (0:00:00.281) 0:22:45.697 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 05:26:24 -0400 (0:00:00.286) 0:22:45.983 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 05:26:25 -0400 (0:00:00.182) 0:22:46.166 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 05:26:25 -0400 (0:00:00.146) 0:22:46.312 ********** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 05:26:25 -0400 (0:00:00.269) 0:22:46.582 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 05:26:25 -0400 (0:00:00.195) 0:22:46.777 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 05:26:25 -0400 (0:00:00.198) 
0:22:46.976 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-1141d0ba-5865-4d42-80f2-7df886666bbe /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026 05:26:25 -0400 (0:00:00.122) 0:22:47.098 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026 05:26:26 -0400 (0:00:00.122) 0:22:47.221 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026 05:26:26 -0400 (0:00:00.163) 0:22:47.384 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026 05:26:26 -0400 (0:00:00.225) 0:22:47.609 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026 05:26:26 -0400 (0:00:00.177) 0:22:47.786 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026 05:26:26 -0400 (0:00:00.162) 0:22:47.949 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026 05:26:26 -0400 (0:00:00.094) 0:22:48.043 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026 05:26:27 -0400 (0:00:00.121) 0:22:48.164 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026 05:26:27 -0400 (0:00:00.090) 0:22:48.255 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 20 April 2026 05:26:27 -0400 (0:00:00.090) 0:22:48.346 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 20 April 2026 05:26:27 -0400 (0:00:00.180) 0:22:48.527 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 20 April 2026 05:26:27 -0400 (0:00:00.117) 0:22:48.644 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026 05:26:27 -0400 (0:00:00.158) 0:22:48.803 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026 05:26:27 -0400 (0:00:00.220) 0:22:49.024 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026 05:26:28 -0400 (0:00:00.130) 0:22:49.155 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume]
*************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026 05:26:28 -0400 (0:00:00.150) 0:22:49.305 **********
ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026 05:26:29 -0400 (0:00:01.237) 0:22:50.543 **********
ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026 05:26:30 -0400 (0:00:00.966) 0:22:51.510 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026 05:26:30 -0400 (0:00:00.225) 0:22:51.735 **********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026 05:26:30 -0400 (0:00:00.216) 0:22:51.951 **********
ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026 05:26:31 -0400 (0:00:01.049) 0:22:53.000 **********
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026 05:26:31 -0400 (0:00:00.083) 0:22:53.084 **********
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026 05:26:32 -0400 (0:00:00.114) 0:22:53.198 **********
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026 05:26:32 -0400 (0:00:00.128) 0:22:53.326 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026 05:26:32 -0400 (0:00:00.146) 0:22:53.473 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026 05:26:32 -0400 (0:00:00.165) 0:22:53.638 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026 05:26:32 -0400 (0:00:00.163) 0:22:53.801 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026 05:26:32 -0400 (0:00:00.210) 0:22:54.012 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026 05:26:33 -0400 (0:00:00.174) 0:22:54.187 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026 05:26:33 -0400 (0:00:00.191) 0:22:54.379 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026 05:26:33 -0400 (0:00:00.081) 0:22:54.461 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April 2026 05:26:33 -0400 (0:00:00.126) 0:22:54.587 **********
skipping: [managed-node2] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Monday 20 April 2026 05:26:33 -0400 (0:00:00.199) 0:22:54.787 **********
skipping: [managed-node2] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Monday 20 April 2026 05:26:33 -0400 (0:00:00.151) 0:22:54.938 **********
skipping: [managed-node2] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Monday 20 April 2026 05:26:34 -0400 (0:00:00.265) 0:22:55.204 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Monday 20 April 2026 05:26:34 -0400 (0:00:00.209) 0:22:55.413 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Monday 20 April 2026 05:26:34 -0400 (0:00:00.193) 0:22:55.607 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Monday 20 April 2026 05:26:34 -0400 (0:00:00.196) 0:22:55.804 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Monday 20 April 2026 05:26:34 -0400 (0:00:00.207) 0:22:56.011 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Monday 20 April 2026 05:26:35 -0400 (0:00:00.183) 0:22:56.195 **********
ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Monday 20 April 2026 05:26:35 -0400 (0:00:00.199) 0:22:56.395 **********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Monday 20 April 2026 05:26:35 -0400 (0:00:00.260) 0:22:56.655 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Monday 20 April 2026 05:26:35 -0400 (0:00:00.274) 0:22:56.929 **********
ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023348", "end": "2026-04-20 05:26:36.659421", "rc": 0, "start": "2026-04-20 05:26:36.636073" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Monday 20 April 2026 05:26:36 -0400 (0:00:00.955) 0:22:57.885 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Monday 20 April 2026 05:26:36 -0400 (0:00:00.163) 0:22:58.048 **********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Monday 20 April 2026 05:26:37 -0400 (0:00:00.119) 0:22:58.167 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Monday 20 April 2026 05:26:37 -0400 (0:00:00.055) 0:22:58.223 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Monday 20 April 2026 05:26:37 -0400 (0:00:00.304) 0:22:58.527 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Monday 20 April 2026 05:26:37 -0400 (0:00:00.204) 0:22:58.732 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Monday 20 April 2026 05:26:37 -0400 (0:00:00.048) 0:22:58.780 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026 05:26:37 -0400 (0:00:00.102) 0:22:58.883 **********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Monday 20 April 2026 05:26:37 -0400 (0:00:00.098) 0:22:58.982 **********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Clean up]
****************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:516
Monday 20 April 2026 05:26:37 -0400 (0:00:00.147) 0:22:59.129 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node2
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Monday 20 April 2026 05:26:38 -0400 (0:00:00.541) 0:22:59.671 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Monday 20 April 2026 05:26:38 -0400 (0:00:00.149) 0:22:59.821 **********

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Monday 20 April 2026 05:26:38 -0400 (0:00:00.284) 0:23:00.105 **********
ok: [managed-node2] => { "ansible_facts": { "discovered_interpreter_python": "/usr/libexec/platform-python" }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:6
Monday 20 April 2026 05:26:40 -0400 (0:00:01.568) 0:23:01.673 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Monday 20 April 2026 05:26:40 -0400 (0:00:00.164) 0:23:01.838 **********
ok: [managed-node2]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Monday 20 April 2026 05:26:42 -0400 (0:00:01.587) 0:23:03.425 **********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Monday 20 April 2026 05:26:42 -0400 (0:00:00.552) 0:23:03.978 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Monday 20 April 2026 05:26:43 -0400 (0:00:00.312) 0:23:04.290 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Monday 20 April 2026 05:26:43 -0400 (0:00:00.324) 0:23:04.615 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Monday 20 April 2026 05:26:43 -0400 (0:00:00.196) 0:23:04.811 **********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17
Monday 20 April 2026 05:26:43 -0400 (0:00:00.220) 0:23:05.032 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Monday 20 April 2026 05:26:44 -0400 (0:00:00.294) 0:23:05.326 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Monday 20 April 2026 05:26:44 -0400 (0:00:00.119) 0:23:05.445 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Monday 20 April 2026 05:26:44 -0400 (0:00:00.238) 0:23:05.684 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Monday 20 April 2026 05:26:48 -0400 (0:00:03.789) 0:23:09.473 **********
ok: [managed-node2] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Monday 20 April 2026 05:26:48 -0400 (0:00:00.055) 0:23:09.529 **********
ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Monday 20 April 2026 05:26:48 -0400 (0:00:00.063) 0:23:09.593 **********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Monday 20 April 2026 05:26:53 -0400 (0:00:05.129) 0:23:14.723 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Monday 20 April 2026 05:26:53 -0400 (0:00:00.277) 0:23:15.000 **********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Monday 20 April 2026 05:26:54 -0400 (0:00:00.157) 0:23:15.158 **********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Monday 20 April 2026 05:26:54 -0400 (0:00:00.242) 0:23:15.400 **********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Monday 20 April 2026 05:26:54 -0400 (0:00:00.104) 0:23:15.505 **********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Monday 20 April 2026 05:26:58 -0400 (0:00:03.872) 0:23:19.377 **********
ok: [managed-node2] => { "ansible_facts": { "services": {
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" },
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" },
"autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" },
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" },
"chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" },
"cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" },
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" },
"cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" },
"crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" },
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" },
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" },
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" },
"dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" },
"dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" },
"dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" },
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" },
"ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" },
"firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" },
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" },
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" },
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" },
"halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" },
"import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" },
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" },
"iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" },
"kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" },
"kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" },
"loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" },
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" },
"lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" },
"man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" },
"man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" },
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" },
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd",
"state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": 
"plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": 
"systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": 
"systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": 
"systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 05:27:01 -0400 (0:00:02.838) 0:23:22.215 ********** TASK 
[fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 05:27:01 -0400 (0:00:00.391) 0:23:22.607 ********** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": 
"defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 05:27:07 -0400 (0:00:06.164) 0:23:28.772 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 05:27:07 -0400 (0:00:00.223) 0:23:28.995 ********** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1776677138.018064, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5f51f37e6263abfde83d149a6a0ca6f53f47fa59", "ctime": 1776677138.015064, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 461373573, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776677138.015064, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2863117270", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task 
path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 05:27:09 -0400 (0:00:01.420) 0:23:30.416 ********** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 05:27:10 -0400 (0:00:01.497) 0:23:31.913 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 05:27:11 -0400 (0:00:00.313) 0:23:32.227 ********** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I", 
"_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 05:27:11 -0400 (0:00:00.269) 0:23:32.497 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 05:27:11 -0400 (0:00:00.256) 0:23:32.753 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 05:27:11 -0400 (0:00:00.254) 0:23:33.008 ********** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-1141d0ba-5865-4d42-80f2-7df886666bbe" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 05:27:13 -0400 (0:00:01.420) 0:23:34.429 ********** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184
Monday 20 April 2026  05:27:15 -0400 (0:00:01.889)       0:23:36.318 **********

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195
Monday 20 April 2026  05:27:15 -0400 (0:00:00.233)       0:23:36.552 **********

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207
Monday 20 April 2026  05:27:15 -0400 (0:00:00.262)       0:23:36.814 **********
ok: [managed-node2] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215
Monday 20 April 2026  05:27:17 -0400 (0:00:02.086)       0:23:38.901 **********
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "atime": 1776677148.0850415,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 8,
        "charset": "us-ascii",
        "checksum": "265fd1472633334bc5efcfcc9b612b3bf8c438ee",
        "ctime": 1776677142.3200543,
        "dev": 51713,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 180355273,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "text/plain",
        "mode": "0600",
        "mtime": 1776677142.3190544,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 66,
        "uid": 0,
        "version": "3772907319",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220
Monday 20 April 2026  05:27:19 -0400 (0:00:01.368)       0:23:40.269 **********
changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-1141d0ba-5865-4d42-80f2-7df886666bbe', 'password': '-', 'state': 'absent'}) => {
    "ansible_loop_var": "entry",
    "backup": "",
    "changed": true,
    "entry": {
        "backing_device": "/dev/mapper/foo-test1",
        "name": "luks-1141d0ba-5865-4d42-80f2-7df886666bbe",
        "password": "-",
        "state": "absent"
    },
    "found": 1
}

MSG:

1 line(s) removed

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Monday 20 April 2026  05:27:20 -0400 (0:00:01.344)       0:23:41.613 **********
ok: [managed-node2]

TASK [Verify role results - 11] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:525
Monday 20 April 2026  05:27:22 -0400 (0:00:02.075)       0:23:43.689 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2

TASK [Print out pool information] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Monday 20 April 2026  05:27:23 -0400 (0:00:00.520)       0:23:44.210 **********
skipping: [managed-node2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Monday 20 April 2026  05:27:23 -0400 (0:00:00.136)       0:23:44.347 **********
ok: [managed-node2] => {
    "_storage_volumes_list": [
        {
            "_device": "/dev/sda",
            "_mount_id": "UUID=yfZyjj-Y4R9-sVvm-mPjh-hGC0-QSHA-lX0H1I",
            "_raw_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "lvmpv",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_group": null,
            "mount_mode": null,
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": null,
            "mount_user": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "raid_stripe_size": null,
            "size": 10737418240,
            "state": "absent",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Monday 20 April 2026  05:27:23 -0400 (0:00:00.231)       0:23:44.578 **********
ok: [managed-node2] => {
    "changed": false,
    "info": {
        "/dev/sda": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdd": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sdd",
            "size": "1T",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sde": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sde",
            "size": "1T",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdf": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdg": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sdg",
            "size": "1T",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdh": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sdh",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdi": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/sdi",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/xvda": {
            "fstype": "",
            "label": "",
            "mountpoint": "",
            "name": "/dev/xvda",
            "size": "250G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/xvda1": {
            "fstype": "xfs",
            "label": "",
            "mountpoint": "/",
            "name": "/dev/xvda1",
            "size": "250G",
            "type": "partition",
            "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Monday 20 April 2026  05:27:24 -0400 (0:00:01.302)       0:23:45.881 **********
ok: [managed-node2] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002908",
    "end": "2026-04-20 05:27:26.052449",
    "rc": 0,
    "start": "2026-04-20 05:27:26.049541"
}

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Monday 20 April 2026  05:27:26 -0400 (0:00:01.533)       0:23:47.414 **********
ok: [managed-node2] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002623",
    "end": "2026-04-20 05:27:27.421510",
    "failed_when_result": false,
    "rc": 0,
    "start": "2026-04-20 05:27:27.418887"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Monday 20 April 2026  05:27:27 -0400 (0:00:01.370)       0:23:48.785 **********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Monday 20 April 2026  05:27:27 -0400 (0:00:00.133)       0:23:48.919 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Monday 20 April 2026  05:27:28 -0400 (0:00:00.271)       0:23:49.191 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Monday 20 April 2026  05:27:28 -0400 (0:00:00.184)       0:23:49.375 **********
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2
included:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2
included: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Monday 20 April 2026  05:27:29 -0400 (0:00:00.930)       0:23:50.306 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/sda"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Monday 20 April 2026  05:27:29 -0400 (0:00:00.180)       0:23:50.486 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_mount_expected_mount_point": "",
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Monday 20 April 2026  05:27:29 -0400 (0:00:00.205)       0:23:50.692 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Monday 20 April 2026  05:27:29 -0400 (0:00:00.228)       0:23:50.920 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Monday 20 April 2026  05:27:29 -0400 (0:00:00.158)       0:23:51.079 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Monday 20 April 2026  05:27:30 -0400 (0:00:00.179)       0:23:51.259 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Monday 20 April 2026  05:27:30 -0400 (0:00:00.178)       0:23:51.437 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Monday 20 April 2026  05:27:30 -0400 (0:00:00.127)       0:23:51.564 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Monday 20 April 2026  05:27:30 -0400 (0:00:00.319)       0:23:51.884 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Monday 20 April 2026  05:27:30 -0400 (0:00:00.165)       0:23:52.050 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Monday 20 April 2026  05:27:31 -0400 (0:00:00.182)       0:23:52.232 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_found_mount_stat": null,
        "storage_test_mount_expected_mount_point": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Monday 20 April 2026  05:27:31 -0400 (0:00:00.137)       0:23:52.369 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "0",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Monday 20 April 2026  05:27:31 -0400 (0:00:00.319)       0:23:52.689 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Monday 20 April 2026  05:27:31 -0400 (0:00:00.230)       0:23:52.919 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Monday 20 April 2026  05:27:31 -0400 (0:00:00.182)       0:23:53.102 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Monday 20 April 2026  05:27:32 -0400 (0:00:00.179)       0:23:53.282 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Monday 20 April 2026  05:27:32 -0400 (0:00:00.306)       0:23:53.588 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Monday 20 April 2026  05:27:32 -0400 (0:00:00.172)       0:23:53.761 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Monday 20 April 2026  05:27:32 -0400 (0:00:00.282)       0:23:54.043 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Monday 20 April 2026  05:27:33 -0400 (0:00:00.247)       0:23:54.290 **********
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "atime": 1776677227.2888637,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1776677227.2888637,
        "dev": 6,
        "device_type": 2048,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 37259,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/blockdevice",
        "mode": "0660",
        "mtime": 1776677227.2888637,
        "nlink": 1,
        "path": "/dev/sda",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Monday 20 April 2026  05:27:34 -0400 (0:00:01.074)       0:23:55.365 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path:
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Monday 20 April 2026  05:27:34 -0400 (0:00:00.245)       0:23:55.611 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Monday 20 April 2026  05:27:34 -0400 (0:00:00.274)       0:23:55.886 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Monday 20 April 2026  05:27:34 -0400 (0:00:00.142)       0:23:56.028 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "st_volume_type": "disk"
    },
    "changed": false
}

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Monday 20 April 2026  05:27:35 -0400 (0:00:00.242)       0:23:56.270 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Monday 20 April 2026  05:27:35 -0400 (0:00:00.211)       0:23:56.482 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Monday 20 April 2026  05:27:35 -0400 (0:00:00.102)       0:23:56.585 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Monday 20 April 2026  05:27:35 -0400 (0:00:00.088)       0:23:56.673 **********
ok: [managed-node2] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Monday 20 April 2026  05:27:39 -0400 (0:00:04.109)       0:24:00.783 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Monday 20 April 2026  05:27:39 -0400 (0:00:00.213)       0:24:00.996 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Monday 20 April 2026  05:27:40 -0400 (0:00:00.237)       0:24:01.234 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Monday 20 April 2026  05:27:40 -0400 (0:00:00.144)       0:24:01.378 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Monday 20 April 2026  05:27:40 -0400 (0:00:00.220)       0:24:01.599 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Monday 20 April 2026  05:27:40 -0400 (0:00:00.184)       0:24:01.784 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Monday 20 April 2026  05:27:40 -0400 (0:00:00.175)       0:24:01.959 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Monday 20 April 2026  05:27:40 -0400 (0:00:00.147)       0:24:02.107 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set test variables] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Monday 20 April 2026  05:27:41 -0400 (0:00:00.129)       0:24:02.237 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Monday 20 April 2026  05:27:41 -0400 (0:00:00.239)       0:24:02.476 **********
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Monday 20 April 2026  05:27:41 -0400 (0:00:00.319)       0:24:02.795 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Monday 20 April 2026  05:27:41 -0400 (0:00:00.160)       0:24:02.955 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Monday 20 April 2026  05:27:41 -0400 (0:00:00.115)       0:24:03.071 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Monday 20 April 2026  05:27:42 -0400 (0:00:00.239)       0:24:03.311 **********
ok: [managed-node2] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Monday 20 April 2026  05:27:42 -0400 (0:00:00.162)       0:24:03.473 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Monday 20 April 2026  05:27:42 -0400 (0:00:00.133)       0:24:03.606 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Monday 20 April 2026  05:27:42 -0400 (0:00:00.254)       0:24:03.861 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Monday 20 April 2026  05:27:42 -0400 (0:00:00.228)       0:24:04.089 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Monday 20 April 2026  05:27:43 -0400 (0:00:00.158)       0:24:04.248 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Monday 20 April 2026  05:27:43 -0400 (0:00:00.208)       0:24:04.457 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Monday 20 April 2026  05:27:43 -0400 (0:00:00.237)       0:24:04.694 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Monday 20 April 2026  05:27:43 -0400 (0:00:00.160)       0:24:04.855 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Monday 20 April 2026  05:27:43 -0400 (0:00:00.118)       0:24:04.973 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Monday 20 April 2026  05:27:43 -0400 (0:00:00.087)       0:24:05.061 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Monday 20 April 2026  05:27:44 -0400 (0:00:00.141)       0:24:05.202 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Monday 20 April 2026  05:27:44 -0400 (0:00:00.274)       0:24:05.477 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Monday 20 April 2026  05:27:44 -0400 (0:00:00.244)       0:24:05.722 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show expected size] ******************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Monday 20 April 2026  05:27:44 -0400 (0:00:00.264)       0:24:05.986 **********
ok: [managed-node2] => {
    "storage_test_expected_size": "4294967296"
}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Monday 20 April 2026  05:27:45 -0400 (0:00:00.192)       0:24:06.179 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show test pool] **********************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Monday 20 April 2026  05:27:45 -0400 (0:00:00.161)       0:24:06.340 **********
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Monday 20 April 2026  05:27:45 -0400 (0:00:00.195)       0:24:06.536 **********
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Monday 20 April 2026  05:27:45 -0400 (0:00:00.205)       0:24:06.742 **********
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Monday 20 April 2026  05:27:45 -0400 (0:00:00.119)       0:24:06.861 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Monday 20 April 2026  05:27:45 -0400 (0:00:00.236)       0:24:07.098 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Monday 20 April 2026  05:27:46 -0400 (0:00:00.093)       0:24:07.191 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Monday 20 April 2026  05:27:46 -0400 (0:00:00.159)       0:24:07.351 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Monday 20 April 2026  05:27:46 -0400 (0:00:00.121)       0:24:07.473 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Monday 20 April 2026  05:27:46 -0400 (0:00:00.089)       0:24:07.562 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Monday 20 April 2026  05:27:46 -0400 (0:00:00.096)       0:24:07.659 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Monday 20 April 2026  05:27:46 -0400 (0:00:00.089)       0:24:07.748 **********
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Monday 20 April
2026 05:27:46 -0400 (0:00:00.074) 0:24:07.823 ********** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 05:27:46 -0400 (0:00:00.205) 0:24:08.029 ********** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 05:27:47 -0400 (0:00:00.218) 0:24:08.247 ********** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 05:27:47 -0400 (0:00:00.179) 0:24:08.427 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 05:27:47 -0400 (0:00:00.203) 0:24:08.630 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 05:27:47 -0400 (0:00:00.195) 0:24:08.826 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 05:27:47 -0400 (0:00:00.150) 0:24:08.977 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 05:27:47 -0400 (0:00:00.154) 0:24:09.131 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 05:27:48 -0400 (0:00:00.232) 0:24:09.364 ********** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 05:27:48 -0400 (0:00:00.119) 0:24:09.483 ********** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 05:27:48 -0400 (0:00:00.126) 0:24:09.610 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 05:27:48 -0400 (0:00:00.131) 0:24:09.741 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 05:27:48 -0400 (0:00:00.160) 0:24:09.902 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 05:27:48 -0400 (0:00:00.125) 0:24:10.027 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 05:27:49 -0400 (0:00:00.184) 0:24:10.212 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 05:27:49 -0400 (0:00:00.197) 0:24:10.409 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 
April 2026 05:27:49 -0400 (0:00:00.262) 0:24:10.672 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 05:27:49 -0400 (0:00:00.210) 0:24:10.883 ********** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 05:27:49 -0400 (0:00:00.227) 0:24:11.110 ********** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 05:27:50 -0400 (0:00:00.140) 0:24:11.250 ********** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node2 : ok=1271 changed=60 unreachable=0 failed=9 skipped=1110 rescued=9 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:04:58.495955+00:00Z", "host": "managed-node2", "message": "encrypted volume 'foo' missing key/password", "start_time": "2026-04-20T09:04:53.340432+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { 
"ansible_version": "2.9.27", "end_time": "2026-04-20T09:04:58.652714+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:04:58.515305+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:07:02.080111+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'luks-00a7163b-2630-4914-9eff-8d6f78b6405b' in safe mode due to encryption removal", "start_time": "2026-04-20T09:06:56.805079+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:07:02.328167+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], 
"mounts": [], "msg": "cannot remove existing formatting on device 'luks-00a7163b-2630-4914-9eff-8d6f78b6405b' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:07:02.097238+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:08:53.799186+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-04-20T09:08:48.220969+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:08:53.980028+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:08:53.812666+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:10:50.034930+00:00Z", "host": "managed-node2", "message": "encrypted volume 'test1' missing key/password", 
"start_time": "2026-04-20T09:10:44.640677+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:10:50.314041+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:10:50.042586+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:13:14.050860+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'luks-082dbd3e-5ddd-4519-a194-5805ce58d144' in safe mode due to encryption removal", "start_time": "2026-04-20T09:13:08.732384+00:00Z", 
"task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:13:14.287604+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-082dbd3e-5ddd-4519-a194-5805ce58d144' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:13:14.098640+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:15:34.078942+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": 
"2026-04-20T09:15:28.444168+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:15:34.353420+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": 
null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:15:34.100153+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:17:59.906311+00:00Z", "host": "managed-node2", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-20T09:17:54.451377+00:00Z", "task_name": "Manage 
the pools and volumes to match the specified state", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:18:00.179939+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:17:59.939225+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:22:26.089108+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'luks-8fc43477-33fc-4cea-9240-68fd98da42e7' in safe mode due to encryption removal", "start_time": "2026-04-20T09:22:20.812793+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": 
"/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:22:26.329589+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": 
"test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-8fc43477-33fc-4cea-9240-68fd98da42e7' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:22:26.168842+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:24:51.959782+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-04-20T09:24:46.899345+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", 
"task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T09:24:52.132635+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": 
null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T09:24:51.972092+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Monday 20 April 2026 05:27:50 -0400 (0:00:00.235) 0:24:11.486 ********** =============================================================================== fedora.linux_system_roles.storage : Record storage role fingerprint in syslog -- 32.27s 
/tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 15.70s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 15.57s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.94s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.42s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.35s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.32s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 9.53s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:17 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.16s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.89s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : 
Manage the pools and volumes to match the specified state --- 5.70s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.69s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.67s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.64s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.61s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.61s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.57s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.56s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.50s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.48s /tmp/collections-Z1n/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
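The three distinct failures recorded in the errors blob above are the storage role's safety checks firing: an encrypted volume declared without a key or password ("encrypted volume 'test1' missing key/password"), and two safe-mode refusals to reformat a device when encryption is being removed or added. As a hedged sketch only (the pool, volume, and disk names mirror the log, but the vault variable and the play layout are illustrative assumptions, not taken from the actual test playbooks), an invocation that avoids all three errors supplies a password with the encrypted volume and explicitly opts out of safe mode when reformatting is intended:

```yaml
# Illustrative sketch, not the test playbook: names (foo, test1, sda) are
# copied from the module_args in the log above; vault_luks_password is an
# assumed vault-stored variable.
- hosts: managed-node2
  vars:
    # safe_mode defaults to true; the log shows it must be disabled before
    # the role will remove existing formatting to add or drop encryption
    storage_safe_mode: false
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                # omitting this is what produced
                # "encrypted volume 'test1' missing key/password"
                encryption_password: "{{ vault_luks_password }}"
```

Note that in these tests the failures are expected: the playbooks deliberately invoke the role with `safe_mode: true` (visible in the second and third error payloads) to verify that destructive encryption changes are refused.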